Trust in Automation: Designing for Appropriate Reliance

John D. Lee and Katrina A. See, University of Iowa, Iowa City, Iowa

Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.

Address correspondence to John D. Lee, 2130 Seamans Center, Department of Mechanical and Industrial Engineering, University of Iowa, Iowa City, IA 52245; jdlee@engineering.uiowa.edu. HUMAN FACTORS, Vol. 46, No. 1, Spring 2004, pp. 50–80. Copyright © 2004, Human Factors and Ergonomics Society. All rights reserved.

INTRODUCTION

Sophisticated automation is becoming ubiquitous, appearing in work environments as diverse as aviation, maritime operations, process control, motor vehicle operation, and information retrieval. Automation is technology that actively selects data, transforms information, makes decisions, or controls processes. Such technology exhibits tremendous potential to extend human performance and improve safety; however, recent disasters indicate that it is not uniformly beneficial. On the one hand, people may trust automation even when it is not appropriate. Pilots, trusting the ability of the autopilot, failed to intervene and take manual control even as the autopilot crashed the Airbus A320 they were flying (Sparaco, 1995). In another instance, an automated navigation system malfunctioned and the crew failed to intervene, allowing the Royal Majesty cruise ship to drift off course for 24 hours before it ran aground (Lee & Sanquist, 2000; National Transportation Safety Board, 1997). On the other hand, people are not always willing to put sufficient trust in automation. Some operators rejected automated controllers in paper mills, undermining the potential benefits of the automation (Zuboff, 1988). As automation becomes more prevalent, poor partnerships between people and automation will become increasingly costly and catastrophic.

Such flawed partnerships between automation and people can be described in terms of misuse and disuse of automation (Parasuraman & Riley, 1997). Misuse refers to the failures that occur when people inadvertently violate critical assumptions and rely on automation inappropriately, whereas disuse signifies failures that occur when people reject the capabilities of automation. Misuse and disuse are two examples of inappropriate reliance on automation that can compromise safety and profitability. Although this paper describes reliance on automation as a discrete process of engaging or disengaging, automation can be a very complex combination of many modes, and reliance is often a more graded process. Automation reliance is not a simple binary process, but the simplification makes the discussion of misuse and disuse more tractable.



Understanding how to mitigate disuse and misuse of automation is a critically important problem with broad ramifications.

Recent research suggests that misuse and disuse of automation may depend on certain feelings and attitudes of users, such as trust. This is particularly important as automation becomes more complex and goes beyond a simple tool with clearly defined and easily understood behaviors. In particular, many studies show that humans respond socially to technology, and reactions to computers can be similar to reactions to human collaborators (Reeves & Nass, 1996). For example, the similarity-attraction hypothesis in social psychology predicts that people with similar personality characteristics will be attracted to each other (Nass & Lee, 2001).

This finding also predicts user acceptance of software (Nass & Lee, 2001; Nass, Moon, Fogg, Reeves, & Dryer, 1995). Software that displays personality characteristics similar to those of the user tends to be more readily accepted. For example, computers that use phrases such as "You should definitely do this" will tend to appeal to dominant users, whereas computers that use less directive language, such as "Perhaps you should do this," tend to appeal to submissive users (Nass & Lee). Similarly, the concept of affective computing suggests that computers that can sense and respond to users' emotional states may greatly improve human-computer interaction (Picard, 1997). More recently, the concept of computer etiquette suggests that human-computer interactions can be enhanced by recognizing how the social and work contexts interact with the roles of the computer and human to specify acceptable behavior (Miller, 2002). More generally, designs that consider affect are likely to enhance productivity and acceptance (Norman, Ortony, & Russell, 2003). Together, this research suggests that the emotional and attitudinal factors that influence human-human relationships may also contribute to human-automation relationships.

Trust, a social psychological concept, seems particularly important for understanding human-automation partnerships. Trust can be defined as the attitude that an agent will help achieve an individual's goals in a situation characterized by uncertainty and vulnerability. In this definition, an agent can be automation or another person that actively interacts with the environment on behalf of the person. Considerable research has shown the attitude of trust to be important in mediating how people rely on each other (Deutsch, 1958, 1960; Rempel, Holmes, & Zanna, 1985; Ross & LaCroix, 1996; Rotter, 1967). Sheridan (1975) and Sheridan and Hennessy (1984) argued that just as trust mediates relationships between people, it may also mediate the relationship between people and automation. Many studies have demonstrated that trust is a meaningful concept to describe human-automation interaction in both naturalistic (Zuboff, 1988) and laboratory settings (Halprin, Johnson, & Thornburry, 1973; Lee & Moray, 1992; Lewandowsky, Mundy, & Tan, 2000; Muir, 1989; Muir & Moray, 1996). These observations demonstrate that trust is an attitude toward automation that affects reliance and that it can be measured consistently. People tend to rely on automation they trust and tend to reject automation they do not. By guiding reliance, trust helps to overcome the cognitive complexity people face in managing increasingly sophisticated automation.

Trust guides – but does not completely determine – reliance, and the recent surge in research related to trust and reliance has produced many confusing and seemingly conflicting findings. Although many recent articles have described the role of trust in mediating reliance on automation, there has been no integrative review of these studies. The purpose of this paper is to provide such a review, link trust in automation to the burgeoning research on trust in other domains, and resolve conflicting findings. We begin by developing a conceptual model to link organizational, sociological, interpersonal, psychological, and neurological perspectives on trust between people to human-automation trust. We then use this conceptual model of trust and reliance to integrate research related to human-automation trust. The conceptual model identifies important research issues, and it also identifies design, evaluation, and training approaches to promote appropriate trust and reliance.

TRUST AND AFFECT

Researchers from a broad range of disciplines have examined the role of trust in mediating relationships between individuals, between individuals and organizations, and even between organizations. Specifically, trust has been investigated as a critical factor in interpersonal relationships, where the focus is often on romantic relationships (Rempel et al., 1985). In exchange relationships, another important research area, the focus is on trust between management and employees or between supervisors and subordinates (Tan & Tan, 2000). Trust has also been identified as a critical factor in increasing organizational productivity and strengthening organizational commitment (Nyhan, 2000). Trust between firms and customers has become an important consideration in the context of relationship management (Morgan & Hunt, 1994) and Internet commerce (Muller, 1996). Researchers have even considered the issue of trust in the context of the relationship between organizations, such as those in multinational firms (Ring & Vandeven, 1992), in which cross-disciplinary and cross-cultural collaboration is critical (Doney, Cannon, & Mullen, 1998).

Interest in trust has grown dramatically in the last 5 years as many have come to recognize its importance in promoting efficient transactions and cooperation. Trust has emerged as a central focus of organizational theory (Kramer, 1999), has been the focus of recent special issues of the Academy of Management Review (Jones & George, 1998) and the International Journal of Human-Computer Studies (Corritore, Kracher, & Wiedenbeck, 2003b), was the topic of a workshop at the CHI 2001 meeting (Corritore, Kracher, & Wiedenbeck, 2001a), and has been the topic of books such as that by Kramer and Tyler (1996).

The general theme of the increasing cognitive complexity of automation, organizations, and interpersonal interactions explains the recent interest in trust. Trust tends to be less important in well-structured, stable environments, such as procedure-based hierarchical organizations in which an emphasis on order and stability minimizes transactional uncertainty (Moorman, Deshpande, & Zaltman, 1993). Many organizations, however, have recently adopted agile structures, self-directed work groups, matrix structures, and complex automation, all of which make the workplace increasingly complex, unstable, and uncertain. Because these changes enable rapid adaptation to change and accommodate unanticipated variability, there is a trend away from well-structured, procedure-based environments. Although these changes have the potential to make organizations and individuals more productive and able to adapt to the unanticipated (Vicente, 1999), they also increase cognitive complexity and leave more degrees of freedom for the individual to resolve. Trust plays a critical role in people's ability to accommodate the cognitive complexity and uncertainty that accompanies the move away from highly structured organizations and simple technology.

Trust helps people to accommodate complexity in several ways. It supplants supervision when direct observation becomes impractical, and it facilitates choice under uncertainty by acting as a social decision heuristic (Kramer, 1999). It also reduces uncertainty in gauging the responses of others, thereby guiding appropriate reliance and generating a collaborative advantage (Baba, 1999; Ostrom, 1998). Moreover, trust facilitates decentralization and adaptive behavior by making it possible to replace fixed protocols, reporting structures, and procedures with goal-related expectations regarding the capabilities of others. The increased complexity and uncertainty that has inspired the recent interest in trust in other fields parallels the increased complexity and sophistication of automation. Trust in automation guides reliance when the complexity of the automation makes a complete understanding impractical and when the situation demands adaptive behavior that procedures cannot guide. For this reason, the recent interest in trust in other disciplines provides a rich and appropriate theoretical base for understanding how trust mediates reliance on complex automation and, more generally, how it affects computer-mediated collaboration that involves both human and computer agents.

Definition of Trust: Beliefs, Attitudes, Intentions, and Behavior

Not surprisingly, the diverse interest in trust has generated many definitions. This is particularly true when considering how trust relates to automation (Cohen, Parasuraman, & Freeman, 1999; Muir, 1994). By examining the differences and common themes of these definitions, it is possible to identify critical considerations for understanding the role of trust in mediating human-automation interaction. Some researchers focus on trust as an attitude or expectation, and they tend to define trust in one of the following ways: "expectancy held by an individual that the word, promise, or written communication of another can be relied upon" (Rotter, 1967, p. 651); "expectation related to subjective probability an individual assigns to the occurrence of some set of future events" (Rempel et al., 1985, p. 96); "expectation of technically competent role performance" (Barber, 1983, p. 14); or "expectations of fiduciary obligation and responsibility, that is, the expectation that some others in our social relationships have moral obligations and responsibility to demonstrate a special concern for others' interests above their own" (Barber, p. 14). These definitions all include the element of expectation regarding behaviors or outcomes. Clearly, trust concerns an expectancy or an attitude regarding the likelihood of favorable responses.

Another common approach characterizes trust as an intention or willingness to act. This goes beyond attitude in that trust is characterized as an intention to behave in a certain manner or to enter into a state of vulnerability. For example, trust has been defined as "willingness to place oneself in a relationship that establishes or increases vulnerability with the reliance upon someone or something to perform as expected" (Johns, 1996, p. 81); "willingness to rely on an exchange partner in whom one has confidence" (Moorman et al., 1993, p. 82); and "willingness of a party to be vulnerable to the actions of another party based on the expectation that the other will perform a particular action important to the trustor, irrespective of the ability to monitor or control that party" (Mayer, Davis, & Schoorman, 1995, p. 712).

The definition by Mayer et al. (1995) is the most widely used and accepted definition of trust (Rousseau, Sitkin, Burt, & Camerer, 1998). As of April 2003, the Institute for Scientific Information citation database showed 203 citations of this article, far more than others on the topic of trust. The definition identifies vulnerability as a critical element of trust. For trust to be an important part of a relationship, individuals must willingly put themselves at risk or in vulnerable positions by delegating responsibility for actions to another party.

Some authors go beyond intention and define trust as a behavioral result or state of vulnerability or risk (Deutsch, 1960; Meyer, 2001). According to these definitions, trust is the outcome of actions that place people into certain states or situations. It can be seen as, for example, "a state of perceived vulnerability or risk that is derived from an individual's uncertainty regarding the motives, intentions, and prospective actions of others on whom they depend" (Kramer, 1999, p. 571).

These definitions highlight some important inconsistencies regarding whether trust is a belief, attitude, intention, or behavior. These distinctions are of great theoretical importance, as multiple factors mediate the process of translating beliefs and attitudes into behaviors.

Ajzen and Fishbein (1980; Fishbein & Ajzen, 1975) developed a framework that can help reconcile these conflicting definitions of trust. Their framework shows that behaviors result from intentions and that intentions are a function of attitudes. Attitudes, in turn, are based on beliefs. According to this framework, beliefs and perceptions represent the information base that determines attitudes. The availability of information and the person's experiences influence beliefs. An attitude is an affective evaluation of beliefs that guides people to adopt a particular intention. Intentions then translate into behavior, according to the environmental and cognitive constraints a person faces. In the context of trust and reliance, trust is an attitude and reliance is a behavior. This framework keeps beliefs, attitudes, intentions, and behavior conceptually distinct and can help explain the influence of trust on reliance. According to this framework, trust affects reliance as an attitude rather than as a belief, intention, or behavior. Beliefs underlie trust, and various intentions and behaviors may result from different levels of trust.

Considering trust as an intention or behavior has the potential to confuse its effect with the effects of other factors that can influence behavior, such as workload, situation awareness, and self-confidence of the operator (Lee & Moray, 1994; Riley, 1994). Trust is not the only factor mediating the relationship between beliefs and behavior. Other psychological, system, and environmental constraints intervene, such as when operators do not have enough time to engage the automation even though they trust it and intend to use it, or when the effort to engage the automation outweighs its benefits (Kirlik, 1993). Trust stands between beliefs about the characteristics of the automation and the intention to rely on the automation.
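To make this chain concrete, consider the following minimal sketch. It is purely illustrative rather than a model proposed in this paper: the names, weights, and thresholds are hypothetical, and beliefs are reduced to numbers on a 0 to 1 scale. It shows how trust (an attitude computed from beliefs) can be kept conceptually distinct from the intention to rely and from reliance itself, which is further gated by constraints such as available time and engagement effort.

```python
# Illustrative sketch of the belief-attitude-intention-behavior chain
# applied to automation reliance. All names, weights, and thresholds
# are hypothetical.

from dataclasses import dataclass

@dataclass
class Operator:
    beliefs: dict              # perceived properties of the automation, each 0-1
    self_confidence: float     # confidence in manual control, 0-1
    time_available: bool       # environmental constraint on engaging automation
    engagement_effort: float   # cost of engaging the automation, 0-1

def trust(op: Operator) -> float:
    """Attitude: an affective evaluation of beliefs about the automation."""
    return sum(op.beliefs.values()) / len(op.beliefs)

def intends_to_rely(op: Operator) -> bool:
    """Intention: rely when trust in the automation exceeds self-confidence."""
    return trust(op) > op.self_confidence

def relies(op: Operator) -> bool:
    """Behavior: intention translates into reliance only if constraints
    permit (enough time, and effort that does not outweigh the benefit)."""
    return intends_to_rely(op) and op.time_available and op.engagement_effort < 0.5

op = Operator(beliefs={"reliability": 0.9, "predictability": 0.8},
              self_confidence=0.6, time_available=True, engagement_effort=0.2)
print(relies(op))  # True: trust (about 0.85) exceeds self-confidence and constraints permit
```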

Many definitions of trust also indicate the importance of the goal-oriented nature of trust. Although many definitions do not identify this aspect of trust explicitly, several mention the ability of the trustee to perform an important action and the expectation of the trustor that the trustee will perform as expected or can be relied upon (Gurtman, 1992; Johns, 1996; Mayer et al., 1995). These definitions describe the basis of trust in terms of the performance of an agent, the trustee, who furthers the goals of an individual, the trustor. In this way, trust describes a relationship that depends on the characteristics of the trustee, the trustor, and the goal-related context of the interaction. Trust is not a consideration in situations where the trustor does not depend on the trustee to perform some function related to the trustor's goals.

A simple definition of trust consistent with these considerations is the attitude that an agent will help achieve an individual's goals in a situation characterized by uncertainty and vulnerability. This basic definition must be elaborated to consider the appropriateness of trust, the influence of context, the goal-related characteristics of the agent, and the cognitive processes that govern the development and erosion of trust. Figure 1 shows how these factors interact in a dynamic process of reliance, and the following sections elaborate on various components of this conceptual model.

First, we will consider the appropriateness of trust. In Figure 1, appropriateness is shown as the relationship between the true capabilities of the agent and the level of trust. Second, we will consider the context, defined by the characteristics of the individual, organization, and culture. Figure 1 shows this as a collection of factors at the top of the diagram, each affecting a different element of the belief-attitude-intention-behavior sequence associated with reliance on an agent. Third, we will describe the basis of trust. This defines the different types of information needed to maintain an appropriate level of trust, which the bottom of Figure 1 shows in terms of the purpose, process, and performance dimensions that describe the goal-oriented characteristics of the agent. Finally, regarding the cognitive process governing trust, trust depends on the interplay among the analytic, analogical, and affective processes. In each of these sections we will cite literature regarding the organizational, sociological, interpersonal, psychological, and neurological perspectives of human-human trust and then draw parallels with human-automation trust. The understanding of trust in automation can benefit from a multidisciplinary consideration of how context, agent characteristics, and cognitive processes affect the appropriateness of trust.

Figure 1. The interaction of context, agent characteristics, and cognitive properties with the appropriateness of trust.

Appropriate Trust: Calibration, Resolution, and Specificity

Inappropriate reliance associated with misuse and disuse depends in part on how well trust matches the true capabilities of the automation. Supporting appropriate trust is critical in avoiding misuse and disuse of automation, just as it is in facilitating effective interpersonal relationships (Wicks, Berman, & Jones, 1999).

Calibration, resolution, and specificity of trust describe mismatches between trust and the capabilities of automation. Calibration refers to the correspondence between a person's trust in the automation and the automation's capabilities (Lee & Moray, 1994; Muir, 1987). The definitions of appropriate calibration of trust parallel those of misuse and disuse in describing appropriate reliance. Overtrust is poor calibration in which trust exceeds system capabilities; with distrust, trust falls short of the automation's capabilities. In Figure 2, good calibration is represented by the diagonal line, where the level of trust matches automation capabilities. Above this line is overtrust, and below it is distrust.

Resolution refers to how precisely a judgment of trust differentiates levels of automation capability (Cohen et al., 1999). Figure 2 shows that poor resolution occurs when a large range of automation capability maps onto a small range of trust. With low resolution, large changes in automation capability are reflected in small changes in trust. Specificity refers to the degree to which trust is associated with a particular component or aspect of the trustee. Functional specificity describes the differentiation of functions, subfunctions, and modes of automation. With high functional specificity, a person's trust reflects capabilities of specific subfunctions and modes. Low functional specificity means the person's trust reflects the capabilities of the entire system.

Figure 2. The relationship among calibration, resolution, and automation capability in defining appropriate trust in automation. Overtrust may lead to misuse and distrust may lead to disuse. (The figure plots trust against automation capability, or trustworthiness: calibrated trust lies on the diagonal, where trust matches system capabilities and leads to appropriate use; overtrust lies above the diagonal and distrust below it. Poor resolution maps a large range of system capability onto a small range of trust; good resolution maps a range of system capability onto a comparable range of trust.)

Specificity can also describe changes in trust as a function of the situation or over time. High temporal specificity means that a person's trust reflects moment-to-moment fluctuations in automation capability, whereas low temporal specificity means that the trust reflects only long-term changes in automation capability. Although temporal specificity implies a generic change over time as the person's trust adjusts to failures with the automation, temporal specificity also addresses adjustments that should occur when the situation or context changes and affects the capability of the automation. Temporal specificity reflects the sensitivity of trust to changes in context that affect automation capability. High functional and temporal specificity increase the likelihood that the level of trust will match the capabilities of a particular element of the automation at a particular time. Good calibration, high resolution, and high specificity of trust can mitigate misuse and disuse of automation, and so they can guide design, evaluation, and training to enhance human-automation partnerships.
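As a rough illustration of how calibration, bias, and resolution might be quantified, the sketch below scores both trust and automation capability on a common 0 to 1 scale. The data and the particular measures are hypothetical; they are not drawn from the studies cited here.

```python
# Hypothetical illustration: quantifying calibration and resolution from
# paired observations of automation capability and rated trust, both in [0, 1].

import numpy as np

capability = np.array([0.2, 0.4, 0.6, 0.8, 1.0])    # true capability per episode
trust      = np.array([0.5, 0.55, 0.6, 0.6, 0.65])  # operator's rated trust

# Calibration: average absolute mismatch between trust and capability (0 = perfect).
calibration_error = np.mean(np.abs(trust - capability))   # 0.20

# Signed mismatch: positive values indicate overtrust, negative values distrust.
bias = np.mean(trust - capability)                        # -0.02

# Resolution: slope of trust against capability. A slope near 1 means changes
# in capability are mirrored in trust; a slope near 0 means poor resolution.
slope = np.polyfit(capability, trust, 1)[0]               # 0.175: poor resolution

print(calibration_error, bias, slope)
```

Temporal and functional specificity could be assessed in the same way by computing such measures separately over time windows or over individual automation modes.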

Individual, Organizational, and Cultural Context

Trust does not develop in a vacuum but instead evolves in a complex individual, cultural, and organizational context. The individual context includes individual differences, such as the propensity to trust. These differences influence the initial level of trust and influence how new information is interpreted. The individual context also includes a person's specific history of interactions that have led to a particular level of trust. The organizational context can also have a strong influence on trust. The organizational context reflects the interactions between people that inform them about the trustworthiness of others, which can include reputation and gossip. The cultural context also influences trust through social norms and expectations. Understanding trust requires a careful consideration of the individual, organizational, and cultural context.

Systematic individual differences influence trust between people. Some people are more inclined to trust than are others (Gaines et al., 1997; Stack, 1978). Rotter (1967) defined trust as an enduring personality trait. This conception of trust follows a social learning theory approach, in which expectations for a particular situation are determined by specific previous experiences with situations that are perceived to be similar (Rotter, 1971). People develop beliefs about others that are generalized and extrapolated from one interaction to another. In this context, trust is a generalized expectancy that is independent of specific experiences and based on the generalization of a large number of diverse experiences. Individual differences regarding trust have important implications for the study of human-automation trust because they may influence reliance in ways that are not directly related to the characteristics of the automation.

Substantial evidence demonstrates that the tendency to trust, considered as a personality trait, can be reliably measured and can influence behavior in a systematic manner. For example, Rotter's (1980) Interpersonal Trust Scale reliably differentiates people on their propensity to trust others, with an internal consistency of .76 and a test-retest reliability of .56 with 7 months between tests. Interestingly, high-trust individuals are not more gullible than low-trust individuals (Gurtman, 1992), and the propensity to trust is not correlated with measures of intellect (Rotter, 1980); however, high-trust individuals are viewed as more trustworthy by others and display more truthful behavior (Rotter, 1971). In fact, people with a high propensity to trust predicted others' trustworthiness better than those with a low propensity to trust (Kikuchi, Wantanabe, & Yamasishi, 1996). Likewise, low- and high-trust individuals respond differently to feedback regarding collaborators' intentions and to situational risk (Kramer, 1999).

These findings may explain why individual differences in the general tendency to trust automation, as measured by a complacency scale (Parasuraman, Singh, Molloy, & Parasuraman, 1992), are not clearly related to misuse of automation. For example, Singh, Molloy, and Parasuraman (1993) found that high-complacency individuals actually detected more automation failures in a constant reliability condition (53.4% compared with 18.7% for low-complacency individuals). This unexpected result is similar to findings in studies of human trust in which highly trusting individuals were found to trust more appropriately. Several studies of trust in automation show that for some people trust changes substantially as the capability of the automation changes, and for other people trust changes relatively little (Lee & Moray, 1994; Masalonis, 2000). One possible explanation that merits further investigation is that high-trust individuals may be better able to adjust their trust to situations in which the automation is highly capable as well as to situations in which it is not.

In contrast to the description of trust as a stable personality trait, most researchers have focused on trust as an attitude (Jones & George, 1998). In describing interpersonal relationships, trust has been considered as a dynamic attitude that evolves along with the developing relationship (Rempel et al., 1985). The influence of individual differences regarding the predisposition to trust is most important when a situation is ambiguous and generalized expectancies dominate, and it becomes less important as the relationship progresses (McKnight, Cummings, & Chervany, 1998). Trust as an attitude is a history-dependent variable that depends on the prior behavior of the trusted person and the information that is shared (Deutsch, 1958). The initial level of trust is determined by past experiences in similar situations; some of these experiences may even be indirect, as in the case of gossip.

The organizational context influences how reputation, gossip, and formal and informal roles affect the trust of people who have never had any direct contact with the trustee (Burt & Knez, 1996; Kramer, 1999). For example, engineers are trusted not because of the ability of any specific person but because of the underlying education and regulatory structure that governs people in the role of an engineer (Kramer, 1999). The degree to which the organizational context affects the indirect development of trust depends on the strength of links in the social network and on the trustor's ability to establish links between the situations experienced by others and the situations he or she is confronting (Doney et al., 1998). These findings suggest that the organizational context and the indirect exposure that it facilitates can have an important influence on trust and reliance.

Beyond the individual and organizational context, culture is another element of context that can influence trust and its development (Baba, Falkenburg, & Hill, 1996). Culture can be defined as a set of social norms and expectations that reflect shared educational and life experiences associated with national differences or distinct cohorts of workers. In particular, Japanese citizens have been found to have a generally low level of trust (Yamagishi & Yamagishi, 1994). In Japan, networks of mutually committed relations play a more important role in governing exchange relationships than they do in the United States. A cross-national questionnaire that included 1,136 Japanese and 501 American respondents showed that the American respondents were more trusting of other people in general and considered reputation more important. In contrast, the Japanese respondents placed a higher value on exchange relationships that are based on personal relationships. One explanation for this is that the extreme social stability of mutually committed relationships in Japan reduces uncertainty about transactions and diminishes the role of trust (Doney et al., 1998).

More generally, cultural differences associated with power distance (e.g., dependence on authority and respect for authoritarian norms), uncertainty avoidance, and individualist and collectivist attitudes can influence the development and role of trust (Doney et al., 1998). Cultural variation can influence trust in an on-line environment. Karvonen (2001) compared trust in E-commerce service among consumers from Finland, Sweden, and Iceland. Although all trusted a simple design, there were substantial differences in the initial trust in a complex design, with Finnish customers being the most wary and Icelandic customers the most trusting. More generally, the level of social trust varies substantially among countries, and this variation accounts for over 64% of the variance in the level of Internet adoption (Huang, Keser, Leland, & Shachat, 2002). The culturally dependent nature of trust suggests that findings regarding trust in automation may need to be verified when they are extrapolated from one cultural setting to another.

Cultural context can also describe systematic differences in groups of workers. For example, in the context of trust in automation, Riley (1996) found that pilots, who are accustomed to using automation, trusted and relied on automation more than did students, who were not as familiar with automation. Likewise, in field studies of operators adapting to computerized manufacturing systems, Zuboff (1988) found that the culture associated with those operators who had been exposed to computer systems led to greater trust and acceptance of automation. These results show that trust in automation, like trust in people, is culturally influenced in that it depends on the long-term experiences of a group of people.

Trust between people depends on the individual, organizational, and cultural context. This context influences trust because it affects initial levels of trust and how people interpret information regarding the agent. Some evidence shows that these relationships may also hold for trust in automation; however, very little research has investigated any of these factors. We address the types of information that form the basis of trust in the next section.

Basis of Trust: Levels of Detail and of Attributional Abstraction

Trust is a multidimensional construct that is based on trustee characteristics such as motives, intentions, and actions (Bhattacharya, Devinney, & Pillutla, 1998; Mayer et al., 1995). The basis for trust is the information that informs the person about the ability of the trustee to achieve the goals of the trustor. Two critical elements define the basis of trust. The first is the focus of trust: What is to be trusted? The second is the type of information that describes the entity to be trusted: What is the information supporting trust? This information guides expectations regarding how well the trustee can achieve the trustor's goals. The focus of trust can be described according to levels of detail, and the information supporting trust can be described according to three levels of attributional abstraction.

The focus of trust can be considered along a dimension defined by the level of detail, which varies from general trust of institutions and organizations to trust in a specific agent. With automation, this might correspond to trust in an overall system of automation as compared with trust in a particular mode of an automatic controller. Couch and Jones (1997) considered three levels of trust – generalized, network, and partner trust – which are similar to the three levels identified by McKnight et al. (1998). General trust corresponds to trust in human nature and institutions (Barber, 1983). Network trust refers to the cluster of acquaintances that constitutes a person's social network, whereas partner trust corresponds to trust in specific persons (Rempel et al., 1985).

There does not seem to be a reliable relationship between global and specific trust, suggesting that this dimension makes important distinctions regarding the evolution of trust (Couch & Jones, 1997). For example, trust in a supervisor and trust in an organization are distinct and depend on qualitatively different factors (Tan & Tan, 2000). Trust in a supervisor depends on perceived ability, integrity, and benevolence, whereas trust in the organization depends on more distal variables of perceived organizational support and justice. General trust is frequently considered to be a personality trait, whereas specific trust is frequently viewed as an outcome of a specific interaction (Couch & Jones, 1997). Several studies show, however, that general and specific trust are not necessarily tied to the "trait and state" distinctions and may instead simply refer to different levels of detail (Sitkin & Roth, 1993). Developing a high degree of functional specificity in the trust of automation may require specific information for each level of detail of the automation.

The basis of trust can be considered along a dimension of attributional abstraction, which varies from demonstrations of competence to the intentions of the agent. A recent review of trust literature concluded that three general levels summarize the bases of trust: ability, integrity, and benevolence (Mayer et al., 1995). Ability is the group of skills, competencies, and characteristics that enable the trustee to influence the domain. Integrity is the degree to which the trustee adheres to a set of principles the trustor finds acceptable. Benevolence is the extent to which the intents and motivations of the trustee are aligned with those of the trustor.

In the context of interpersonal relationships, Rempel et al. (1985) viewed trust as an evolving phenomenon, with the basis of trust changing as the relationship progressed. They argued that predictability, the degree to which future behavior can be anticipated (and which is similar to ability), forms the basis of trust early in a relationship. This is followed by dependability, which is the degree to which behavior is consistent and is similar to integrity. As the relationship matures, the basis of trust ultimately shifts to faith, which is a more general judgment that a person can be relied upon and is similar to benevolence. A similar progression emerged in a study of operators' adaptation to new technology (Zuboff, 1988). Trust in that context depended on trial-and-error experience, followed by understanding of the technology's operation, and finally faith. Lee and Moray (1992) made similar distinctions in defining the factors that influence trust in automation. They identified performance, process, and purpose as the general bases of trust.

Performance refers to the current and historical operation of the automation and includes characteristics such as reliability, predictability, and ability. Performance information describes what the automation does. More specifically, performance refers to the competency or expertise as demonstrated by its ability to achieve the operator's goals. Because performance is linked to the ability to achieve specific goals, it demonstrates the task- and situation-dependent nature of trust. This is similar to Sheridan's (1992) concept of robustness as a basis for trust in human-automation relationships. The operator will tend to trust automation that performs in a manner that reliably achieves his or her goals.

Process is the degree to which the automation's algorithms are appropriate for the situation and able to achieve the operator's goals. Process information describes how the automation operates. In interpersonal relationships, this corresponds to the consistency of actions associated with adherence to a set of acceptable principles (Mayer et al., 1995). Process as a basis for trust reflects a shift away from focus on specific behaviors and toward qualities and characteristics attributed to an agent. With process, trust is in the agent and not in the specific actions of the agent. Because of this, the process basis of trust relies on dispositional attributions and inferences drawn from the performance of the agent. In this sense, process is similar to dependability and integrity. Openness, the willingness to give and receive ideas, is another element of the process basis of trust. Interestingly, consistency and openness are more important for trust in peers than for trust in supervisors or subordinates (Schindler & Thomas, 1993).

In the context of automation, the process basis of trust refers to the algorithms and operations that govern the behavior of the automation. This concept is similar to Sheridan's (1992) concept of understandability in human-automation relationships. The operator will tend to trust the automation if its algorithms can be understood and seem capable of achieving the operator's goals in the current situation.

Purpose refers to the degree to which the automation is being used within the realm of the designer's intent. Purpose describes why the automation was developed. Purpose corresponds to faith and benevolence and reflects the perception that the trustee has a positive orientation toward the trustor. With interpersonal relationships, the perception of such a positive orientation depends on the intentions and motives of the trustee. This can take the form of abstract, generalized value congruence (Sitkin & Roth, 1993), which can be described as whether, and to what extent, the trustee has a motive to lie (Hovland, Janis, & Kelly, 1953). The purpose basis of trust reflects the attribution of these characteristics to the automation. Frequently, whether or not this attribution takes place will depend on whether the designer's intent has been communicated to the operator. If so, the operator will tend to trust automation to achieve the goals it was designed to achieve.
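The three bases can thus be viewed as qualitatively different types of goal-oriented information that jointly inform trust. The toy structure below merely restates that organization in code: it is a hypothetical illustration, and the equal weighting is an arbitrary assumption. As noted above, the relative importance of the bases would be expected to shift as a relationship matures, from performance early on toward purpose and faith later.

```python
# Toy summary (hypothetical) of the three bases of trust identified above.

from dataclasses import dataclass

@dataclass
class TrustBasis:
    performance: float  # what the automation does: reliability, predictability, ability
    process: float      # how it operates: appropriateness and understandability of algorithms
    purpose: float      # why it was built: designer's intent, benevolence

    def overall(self) -> float:
        # Equal weights are an arbitrary illustrative choice; in practice the
        # weights would shift as the relationship with the automation matures.
        return (self.performance + self.process + self.purpose) / 3

print(TrustBasis(performance=0.9, process=0.6, purpose=0.7).overall())  # ~0.73
```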

Table 1 builds on the work of Mayer et al. (1995) and shows how the many characteristics of a trustee that affect trust can be summarized by the three bases of performance, process, and purpose. As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.

TABLE 1: Summary of the Dimensions That Describe the Basis of Trust. Each entry lists the bases of trust from the cited study, with the corresponding summary dimension in parentheses.

Barber (1983): Competence (Performance); Persistence (Process); Fiduciary responsibility (Purpose)
Butler & Cantrell (1984): Competence (Performance); Integrity (Process); Consistency (Process); Loyalty (Purpose); Openness (Process)
Cook & Wall (1980): Ability (Performance); Intentions (Purpose)
Deutsch (1960): Ability (Performance); Intentions (Purpose)
Gabarro (1978): Integrity (Process); Motives (Purpose); Openness (Process); Discreetness (Process); Functional/specific competence (Performance); Interpersonal competence (Performance); Business sense (Performance); Judgment (Performance)
Hovland, Janis, & Kelly (1953): Expertise (Performance); Motivation to lie (Purpose)
Jennings (1967): Loyalty (Purpose); Predictability (Process); Accessibility (Process); Availability (Process)
Kee & Knox (1970): Competence (Performance); Motives (Purpose)
Mayer et al. (1995): Ability (Performance); Integrity (Process); Benevolence (Purpose)
Mishra (1996): Competency (Performance); Reliability (Performance); Openness (Process); Concern (Purpose)
Moorman et al. (1993): Integrity (Process); Willingness to reduce uncertainty (Process); Confidentiality (Process); Expertise (Performance); Tactfulness (Process); Sincerity (Process); Congeniality (Performance); Timeliness (Performance)
Rempel et al. (1985): Reliability (Performance); Dependability (Process); Faith (Purpose)
Sitkin & Roth (1993): Context-specific reliability (Performance); Generalized value congruence (Purpose)
Zuboff (1988): Trial-and-error experience (Performance); Understanding (Process); Leap of faith (Purpose)


Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms associated with the dimension of process; analysis of internal mechanisms can support inferences regarding the designer's intent, associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by the manager (purpose basis for trust) and the manager's actions (performance basis of trust) has a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty, in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation, in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument, in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).
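As a caricature of this expected value view, trust could be modeled as a running estimate of the probability that the agent produces a favorable outcome, updated with each interaction. The sketch below is a hypothetical illustration of that idea, not a model taken from the literature cited here.

```python
# Hypothetical illustration of trust as an expected value estimated from
# cumulative interactions with an agent.

def estimated_trust(successes: int, failures: int) -> float:
    """Estimate P(favorable outcome) with a uniform prior (Laplace smoothing),
    so the estimate starts at 0.5 before any interactions."""
    return (successes + 1) / (successes + failures + 2)

outcomes = [True, True, False, True, True, True, False, True]
s = f = 0
for i, ok in enumerate(outcomes, 1):
    s += ok          # favorable interactions accumulate trust
    f += not ok      # unfavorable interactions dissipate it
    print(f"after interaction {i}: trust = {estimated_trust(s, f):.2f}")
```

On this account, trust accumulates with favorable interactions, dissipates with unfavorable ones, and stabilizes as experience grows.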

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance, as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values. The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Analogical trust can also depend on intermediaries, who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust, because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings or procedures guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oately, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oately). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation and that these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions. It also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner; however, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation, in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them. People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or from the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop in a way different from that in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct to understand reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of E-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
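This self-reinforcing loop is straightforward to illustrate. The following minimal sketch is ours, not from the cited studies; the reliance threshold, time constant, and the assumption that trust is frozen while the automation is unused are all illustrative. It shows how a small difference in initial trust can drive two operators to very different stable patterns of reliance:

```python
def simulate_trust(trust0, capability=0.9, tau=5.0, threshold=0.5, steps=50):
    """Closed-loop sketch: trust moves toward the automation's observed
    capability only while the operator relies on it; when the automation
    is unused there is no new evidence, so trust stays put (assumption)."""
    trust = trust0
    for _ in range(steps):
        if trust > threshold:                    # intention to rely (simplified)
            trust += (capability - trust) / tau  # first-order update from observation
        # else: automation unused, nothing observed, trust unchanged
    return trust

# A difference of 0.04 in initial trust yields very different end states:
print(simulate_trust(0.52))  # climbs toward ~0.9: relies, observes, trusts more
print(simulate_trust(0.48))  # stays at 0.48: never engages, never learns
```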

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function. The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly the trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
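Lee and Moray (1994) are described here only in general terms; a minimal first-order form consistent with this verbal description (the notation is ours: T is trust, P(t) is the perceived performance of the automation, and τ is the time constant) would be:

```latex
% A sketch of the first-order lag dynamics described verbally above;
% the notation is ours, not taken from the cited analysis.
\[
\tau \frac{dT(t)}{dt} = P(t) - T(t),
\qquad\text{so for a step change to a constant } P,\quad
T(t) = P + \bigl(T_{0} - P\bigr)\, e^{-t/\tau}.
\]
```

Under this form, the largest change in trust occurs immediately after a fault and the residual effect decays exponentially; a large τ corresponds to low temporal specificity, with trust lagging well behind changes in the automation's actual capability.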

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
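To make the first of these patterns concrete, hysteresis in reliance can be sketched as direction-dependent switching thresholds; the threshold values here are hypothetical, not drawn from the cited studies:

```python
def reliance_with_hysteresis(trust, currently_relying,
                             theta_low=0.4, theta_high=0.7):
    """Direction-dependent (hysteretic) reliance: the operator abandons
    the automation when trust falls below theta_low but re-engages it
    only after trust climbs above the higher theta_high, so the response
    to a given level of trust depends on the direction of the change."""
    if currently_relying and trust < theta_low:
        return False  # trust has eroded enough to switch to manual control
    if not currently_relying and trust > theta_high:
        return True   # trust must recover well past theta_low to re-engage
    return currently_relying  # otherwise keep the current allocation
```

Enhanced contrast could be sketched analogously by making the update rate grow with the size of the change, accelerating rather than delaying the response.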

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that automation should have the final authority for decisions and actions that must be allocated to automation in time-critical situations (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors cause trust and reliance to decline more for those in a familiar city, whose self-confidence was high, as compared with those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).

One approach to reduce overtrust in the automation and increase detection of failures is through adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation and environmental situations that cause automation to perform poorly are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and product dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the process dimension was most influenced by the presence or absence of a fault, as compared with the performance or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.
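One way to read this asymmetry is as different update weights for positive and negative experiences. A hedged sketch follows; the weights are assumptions chosen for illustration, not estimates from the cited studies:

```python
def update_trust(trust, outcome_good, gain=0.05, loss=0.30):
    """Asymmetric trust update: negative interactions pull trust down
    faster than positive interactions build it up, so trust ends up
    conditioned by the automation's worst behaviors."""
    target = 1.0 if outcome_good else 0.0
    rate = gain if outcome_good else loss
    return trust + rate * (target - trust)

# One fault undoes many successes: start at 0.8, ten good trials, one fault.
trust = 0.8
for _ in range(10):
    trust = update_trust(trust, outcome_good=True)   # ~0.88 after ten successes
trust = update_trust(trust, outcome_good=False)      # ~0.62 after a single fault
print(round(trust, 3))
```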

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al.; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, the appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of the automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an on-line survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Curry, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance, it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.
Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.
Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.
Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.
Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.
Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.
Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.
Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.
Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.
Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.
Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.
Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.
Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.
Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.
Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171–184.
Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.
Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.
Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.
Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.
Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.
Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.
Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.
Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.
Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.
Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf
Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.
Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.
Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).
Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.
Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.
Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.
Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.
Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.
Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.
Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.
Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.
Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.
Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.
Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.
Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.
Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.
Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.
Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.
Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.
Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.
Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.
Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.
Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.
Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.
Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.
Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.
Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.
Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.
Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–303). Englewood Cliffs, NJ: Prentice Hall.
Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.
Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.
Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.
Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.
Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.
Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.
Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.
Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.
Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.
Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.
Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.
Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.
Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.
Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.
Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.
Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.
Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.
Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.
Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.
Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.
Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.
Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.
Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.
Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.
Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.
Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.
Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.
Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.
Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.
Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.
Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.
Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.
Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.
Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.
Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.
Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.
Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.
Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.
Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.
Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.
Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.
Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.
Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.
MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.
Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.
Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.
Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.
McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.
Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.
Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.
Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.
Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.
Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.
Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.
Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.
Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.
Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.
Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.
Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.
Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.
Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.
Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.
Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.
Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.
Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.
Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.
Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.
Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.
Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.
Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.
National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.
Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.
Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.
Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.
Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.
Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.
Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.
Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.
Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.
Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.
Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.
Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.
Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.
Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.
Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.
Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.
Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.
Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.
Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.
Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.
Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.
Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.
Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.
Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.
Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.
Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.
Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.
Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.
Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.
Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.
Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.
Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.
Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.
Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.
Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.
Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.
Simon, H. (1957). Models of man. New York: Wiley.
Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.
Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.
Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.
Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.
Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.
Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.
Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.
Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.
Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.
Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.
Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.
Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 24, 1–86.
Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.
Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.
van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.
Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.
Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.
Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.
Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.
Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.
Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.
Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.
Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.
Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.
Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.
Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.
Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering, with a focus on trust in automation, from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003


disuse and misuse of automation is a critically important problem with broad ramifications.

Recent research suggests that misuse and disuse of automation may depend on certain feelings and attitudes of users, such as trust. This is particularly important as automation becomes more complex and goes beyond a simple tool with clearly defined and easily understood behaviors. In particular, many studies show that humans respond socially to technology, and reactions to computers can be similar to reactions to human collaborators (Reeves & Nass, 1996). For example, the similarity-attraction hypothesis in social psychology predicts that people with similar personality characteristics will be attracted to each other (Nass & Lee, 2001).

This finding also predicts user acceptance of software (Nass & Lee, 2001; Nass, Moon, Fogg, Reeves, & Dryer, 1995). Software that displays personality characteristics similar to those of the user tends to be more readily accepted. For example, computers that use phrases such as "You should definitely do this" will tend to appeal to dominant users, whereas computers that use less directive language, such as "Perhaps you should do this," tend to appeal to submissive users (Nass & Lee). Similarly, the concept of affective computing suggests that computers that can sense and respond to users' emotional states may greatly improve human-computer interaction (Picard, 1997). More recently, the concept of computer etiquette suggests that human-computer interactions can be enhanced by recognizing how the social and work contexts interact with the roles of the computer and human to specify acceptable behavior (Miller, 2002). More generally, designs that consider affect are likely to enhance productivity and acceptance (Norman, Ortony, & Russell, 2003). Together, this research suggests that the emotional and attitudinal factors that influence human-human relationships may also contribute to human-automation relationships.

Trust, a social psychological concept, seems particularly important for understanding human-automation partnerships. Trust can be defined as the attitude that an agent will help achieve an individual's goals in a situation characterized by uncertainty and vulnerability. In this definition, an agent can be automation or another person that actively interacts with the environment on behalf of the person. Considerable research has shown the attitude of trust to be important in mediating how people rely on each other (Deutsch, 1958, 1960; Rempel, Holmes, & Zanna, 1985; Ross & LaCroix, 1996; Rotter, 1967). Sheridan (1975) and Sheridan and Hennessy (1984) argued that just as trust mediates relationships between people, it may also mediate the relationship between people and automation. Many studies have demonstrated that trust is a meaningful concept to describe human-automation interaction in both naturalistic (Zuboff, 1988) and laboratory settings (Halprin, Johnson, & Thornburry, 1973; Lee & Moray, 1992; Lewandowsky, Mundy, & Tan, 2000; Muir, 1989; Muir & Moray, 1996). These observations demonstrate that trust is an attitude toward automation that affects reliance and that it can be measured consistently. People tend to rely on automation they trust and tend to reject automation they do not. By guiding reliance, trust helps to overcome the cognitive complexity people face in managing increasingly sophisticated automation.

Trust guides – but does not completely determine – reliance, and the recent surge in research related to trust and reliance has produced many confusing and seemingly conflicting findings. Although many recent articles have described the role of trust in mediating reliance on automation, there has been no integrative review of these studies. The purpose of this paper is to provide such a review, link trust in automation to the burgeoning research on trust in other domains, and resolve conflicting findings. We begin by developing a conceptual model to link organizational, sociological, interpersonal, psychological, and neurological perspectives on trust between people to human-automation trust. We then use this conceptual model of trust and reliance to integrate research related to human-automation trust. The conceptual model identifies important research issues, and it also identifies design, evaluation, and training approaches to promote appropriate trust and reliance.

TRUST AND AFFECT

Researchers from a broad range of disciplines have examined the role of trust in mediating relationships between individuals, between individuals and organizations, and even between organizations. Specifically, trust has been investigated as a critical factor in interpersonal relationships, where the focus is often on romantic relationships (Rempel et al., 1985). In exchange relationships, another important research area, the focus is on trust between management and employees or between supervisors and subordinates (Tan & Tan, 2000). Trust has also been identified as a critical factor in increasing organizational productivity and strengthening organizational commitment (Nyhan, 2000). Trust between firms and customers has become an important consideration in the context of relationship management (Morgan & Hunt, 1994) and Internet commerce (Muller, 1996). Researchers have even considered the issue of trust in the context of the relationship between organizations, such as those in multinational firms (Ring & Vandeven, 1992), in which cross-disciplinary and cross-cultural collaboration is critical (Doney, Cannon, & Mullen, 1998).

Interest in trust has grown dramatically in the last 5 years as many have come to recognize its importance in promoting efficient transactions and cooperation. Trust has emerged as a central focus of organizational theory (Kramer, 1999), has been the focus of recent special issues of the Academy of Management Review (Jones & George, 1998) and the International Journal of Human-Computer Studies (Corritore, Kracher, & Wiedenbeck, 2003b), was the topic of a workshop at the CHI 2001 meeting (Corritore, Kracher, & Wiedenbeck, 2001a), and has been the topic of books such as that by Kramer and Tyler (1996).

The general theme of the increasing cognitive complexity of automation, organizations, and interpersonal interactions explains the recent interest in trust. Trust tends to be less important in well-structured, stable environments, such as procedure-based hierarchical organizations, in which an emphasis on order and stability minimizes transactional uncertainty (Moorman, Deshpande, & Zaltman, 1993). Many organizations, however, have recently adopted agile structures, self-directed work groups, matrix structures, and complex automation, all of which make the workplace increasingly complex, unstable, and uncertain. Because these changes enable rapid adaptation to change and accommodate unanticipated variability, there is a trend away from well-structured, procedure-based environments. Although these changes have the potential to make organizations and individuals more productive and able to adapt to the unanticipated (Vicente, 1999), they also increase cognitive complexity and leave more degrees of freedom for the individual to resolve. Trust plays a critical role in people's ability to accommodate the cognitive complexity and uncertainty that accompanies the move away from highly structured organizations and simple technology.

Trust helps people to accommodate complexity in several ways. It supplants supervision when direct observation becomes impractical, and it facilitates choice under uncertainty by acting as a social decision heuristic (Kramer, 1999). It also reduces uncertainty in gauging the responses of others, thereby guiding appropriate reliance and generating a collaborative advantage (Baba, 1999; Ostrom, 1998). Moreover, trust facilitates decentralization and adaptive behavior by making it possible to replace fixed protocols, reporting structures, and procedures with goal-related expectations regarding the capabilities of others. The increased complexity and uncertainty that has inspired the recent interest in trust in other fields parallels the increased complexity and sophistication of automation. Trust in automation guides reliance when the complexity of the automation makes a complete understanding impractical and when the situation demands adaptive behavior that procedures cannot guide. For this reason, the recent interest in trust in other disciplines provides a rich and appropriate theoretical base for understanding how trust mediates reliance on complex automation and, more generally, how it affects computer-mediated collaboration that involves both human and computer agents.

Definition of Trust: Beliefs, Attitudes, Intentions, and Behavior

Not surprisingly, the diverse interest in trust has generated many definitions. This is particularly true when considering how trust relates to automation (Cohen, Parasuraman, & Freeman, 1999; Muir, 1994). By examining the differences and common themes of these definitions, it is possible to identify critical considerations for understanding the role of trust in mediating human-automation interaction. Some researchers focus on trust as an attitude or expectation, and they tend to define trust in one of the following ways: "expectancy held by an individual that the word, promise or written communication of another can be relied upon" (Rotter, 1967, p. 651); "expectation related to subjective probability an individual assigns to the occurrence of some set of future events" (Rempel et al., 1985, p. 96); "expectation of technically competent role performance" (Barber, 1983, p. 14); or "expectations of fiduciary obligation and responsibility, that is, the expectation that some others in our social relationships have moral obligations and responsibility to demonstrate a special concern for others' interests above their own" (Barber, p. 14). These definitions all include the element of expectation regarding behaviors or outcomes. Clearly, trust concerns an expectancy or an attitude regarding the likelihood of favorable responses.

Another common approach characterizes trust as an intention or willingness to act. This goes beyond attitude in that trust is characterized as an intention to behave in a certain manner or to enter into a state of vulnerability. For example, trust has been defined as "willingness to place oneself in a relationship that establishes or increases vulnerability with the reliance upon someone or something to perform as expected" (Johns, 1996, p. 81); "willingness to rely on an exchange partner in whom one has confidence" (Moorman et al., 1993, p. 82); and "willingness of a party to be vulnerable to the actions of another party based on the expectation that the other will perform a particular action important to the trustor, irrespective of the ability to monitor or control that party" (Mayer, Davis, & Schoorman, 1995, p. 712).

The definition by Mayer et al. (1995) is the most widely used and accepted definition of trust (Rousseau, Sitkin, Burt, & Camerer, 1998). As of April 2003, the Institute for Scientific Information citation database showed 203 citations of this article, far more than others on the topic of trust. The definition identifies vulnerability as a critical element of trust. For trust to be an important part of a relationship, individuals must willingly put themselves at risk or in vulnerable positions by delegating responsibility for actions to another party.

Some authors go beyond intention and define trust as a behavioral result or state of vulnerability or risk (Deutsch, 1960; Meyer, 2001). According to these definitions, trust is the outcome of actions that place people into certain states or situations. It can be seen as, for example, "a state of perceived vulnerability or risk that is derived from an individual's uncertainty regarding the motives, intentions, and prospective actions of others on whom they depend" (Kramer, 1999, p. 571).

These definitions highlight some important inconsistencies regarding whether trust is a belief, attitude, intention, or behavior. These distinctions are of great theoretical importance, as multiple factors mediate the process of translating beliefs and attitudes into behaviors.

Ajzen and Fishbein (1980; Fishbein & Ajzen, 1975) developed a framework that can help reconcile these conflicting definitions of trust. Their framework shows that behaviors result from intentions and that intentions are a function of attitudes. Attitudes, in turn, are based on beliefs. According to this framework, beliefs and perceptions represent the information base that determines attitudes. The availability of information and the person's experiences influence beliefs. An attitude is an affective evaluation of beliefs that guides people to adopt a particular intention. Intentions then translate into behavior, according to the environmental and cognitive constraints a person faces. In the context of trust and reliance, trust is an attitude and reliance is a behavior. This framework keeps beliefs, attitudes, intentions, and behavior conceptually distinct and can help explain the influence of trust on reliance. According to this framework, trust affects reliance as an attitude rather than as a belief, intention, or behavior. Beliefs underlie trust, and various intentions and behaviors may result from different levels of trust.

Considering trust as an intention or behavior has the potential to confuse its effect with the effects of other factors that can influence behavior, such as workload, situation awareness, and self-confidence of the operator (Lee & Moray, 1994; Riley, 1994). Trust is not the only factor mediating the relationship between beliefs and behavior. Other psychological, system, and environmental constraints intervene, such as when operators do not have enough time to engage the automation even though they trust it and intend to use it, or when the effort to engage the automation outweighs its benefits (Kirlik, 1993). Trust stands between beliefs about the characteristics of the automation and the intention to rely on the automation.
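To make this chain concrete, the following minimal sketch (our illustration, not a model proposed in the paper; the functions, thresholds, and numbers are hypothetical) treats trust as the attitude in the belief-attitude-intention-behavior sequence and shows how an environmental constraint, such as insufficient time, can block reliance even when trust and an intention to rely are present.

```python
from dataclasses import dataclass

@dataclass
class Beliefs:
    """Information base about the automation (e.g., experienced reliability)."""
    perceived_reliability: float  # 0..1

def trust_attitude(beliefs: Beliefs) -> float:
    """Trust as an attitude: an affective evaluation of beliefs (0..1).
    Illustrative rule: trust simply tracks perceived reliability."""
    return beliefs.perceived_reliability

def intention_to_rely(trust: float, self_confidence: float) -> bool:
    """Hypothetical rule: an intention to rely forms when trust in the
    automation exceeds self-confidence in manual control."""
    return trust > self_confidence

def reliance_behavior(intends: bool, time_available_s: float,
                      engagement_effort_s: float) -> bool:
    """Behavior follows intention only if constraints permit, e.g.,
    enough time to engage the automation (cf. Kirlik, 1993)."""
    return intends and time_available_s >= engagement_effort_s

beliefs = Beliefs(perceived_reliability=0.8)
trust = trust_attitude(beliefs)
intends = intention_to_rely(trust, self_confidence=0.6)
relies = reliance_behavior(intends, time_available_s=2.0, engagement_effort_s=5.0)
print(f"trust={trust:.2f}, intends={intends}, relies={relies}")
# Output: trust=0.80, intends=True, relies=False -- the operator trusts
# and intends to use the automation but lacks the time to engage it,
# so the attitude (trust) does not translate into the behavior (reliance).
```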

Many definitions of trust also indicate the importance of the goal-oriented nature of trust. Although many definitions do not identify this aspect of trust explicitly, several mention the ability of the trustee to perform an important action and the expectation of the trustor that the trustee will perform as expected or can be relied upon (Gurtman, 1992; Johns, 1996; Mayer et al., 1995). These definitions describe the basis of trust in terms of the performance of an agent, the trustee, who furthers the goals of an individual, the trustor. In this way, trust describes a relationship that depends on the characteristics of the trustee, the trustor, and the goal-related context of the interaction. Trust is not a consideration in situations where the trustor does not depend on the trustee to perform some function related to the trustor's goals.

A simple definition of trust consistent with these considerations is the attitude that an agent will help achieve an individual's goals in a situation characterized by uncertainty and vulnerability. This basic definition must be elaborated to consider the appropriateness of trust, the influence of context, the goal-related characteristics of the agent, and the cognitive processes that govern the development and erosion of trust. Figure 1 shows how these factors interact in a dynamic process of reliance, and the following sections elaborate on various components of this conceptual model.

Figure 1. The interaction of context, agent characteristics, and cognitive properties with the appropriateness of trust.

First, we will consider the appropriateness of trust. In Figure 1, appropriateness is shown as the relationship between the true capabilities of the agent and the level of trust. Second, we will consider the context, defined by the characteristics of the individual, organization, and culture. Figure 1 shows this as a collection of factors at the top of the diagram, each affecting a different element of the belief-attitude-intention-behavior sequence associated with reliance on an agent. Third, we will describe the basis of trust. This defines the different types of information needed to maintain an appropriate level of trust, which the bottom of Figure 1 shows in terms of the purpose, process, and performance dimensions that describe the goal-oriented characteristics of the agent. Finally, regarding the cognitive process governing trust, trust depends on the interplay among the analytic, analogical, and affective processes. In each of these sections, we will cite literature regarding the organizational, sociological, interpersonal, psychological, and neurological perspectives of human-human trust and then draw parallels with human-automation trust. The understanding of trust in automation can benefit from a multidisciplinary consideration of how context, agent characteristics, and cognitive processes affect the appropriateness of trust.

Appropriate Trust: Calibration, Resolution, and Specificity

Inappropriate reliance associated with misuse and disuse depends, in part, on how well trust matches the true capabilities of the automation. Supporting appropriate trust is critical in avoiding misuse and disuse of automation, just as it is in facilitating effective interpersonal relationships (Wicks, Berman, & Jones, 1999).

Calibration, resolution, and specificity of trust describe mismatches between trust and the capabilities of automation. Calibration refers to the correspondence between a person's trust in the automation and the automation's capabilities (Lee & Moray, 1994; Muir, 1987). The definitions of appropriate calibration of trust parallel those of misuse and disuse in describing appropriate reliance. Overtrust is poor calibration in which trust exceeds system capabilities; with distrust, trust falls short of the automation's capabilities. In Figure 2, good calibration is represented by the diagonal line, where the level of trust matches automation capabilities. Above this line is overtrust, and below it is distrust.

Figure 2. The relationship among calibration, resolution, and automation capability in defining appropriate trust in automation. Overtrust may lead to misuse and distrust may lead to disuse. (The figure plots trust against automation capability, or trustworthiness: the diagonal line marks calibrated trust, the region above it overtrust, and the region below it distrust; resolution appears as the range of trust onto which a given range of capability maps.)

Resolution refers to how precisely a judgment of trust differentiates levels of automation capability (Cohen et al., 1999). Figure 2 shows that poor resolution occurs when a large range of automation capability maps onto a small range of trust. With low resolution, large changes in automation capability are reflected in small changes in trust. Specificity refers to the degree to which trust is associated with a particular component or aspect of the trustee. Functional specificity describes the differentiation of functions, subfunctions, and modes of automation. With high functional specificity, a person's trust reflects capabilities of specific subfunctions and modes. Low functional specificity means the person's trust reflects the capabilities of the entire system.

Specificity can also describe changes in trust as a function of the situation or over time. High temporal specificity means that a person's trust reflects moment-to-moment fluctuations in automation capability, whereas low temporal specificity means that the trust reflects only long-term changes in automation capability. Although temporal specificity implies a generic change over time as the person's trust adjusts to failures with the automation, temporal specificity also addresses adjustments that should occur when the situation or context changes and affects the capability of the automation. Temporal specificity reflects the sensitivity of trust to changes in context that affect automation capability. High functional and temporal specificity increase the likelihood that the level of trust will match the capabilities of a particular element of the automation at a particular time. Good calibration, high resolution, and high specificity of trust can mitigate misuse and disuse of automation, and so they can guide design, evaluation, and training to enhance human-automation partnerships.
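Calibration and resolution are defined over pairs of trust judgments and automation capabilities, so they can be given simple quantitative readings. The sketch below is our illustration rather than a measure proposed in the paper; it assumes trust ratings and capability are both scaled from 0 to 1, scores calibration as the mean absolute trust-capability gap, and scores resolution as the ratio of the trust range to the capability range.

```python
# Illustrative scoring of trust calibration and resolution.
# Trust and capability are both expressed on a 0..1 scale (an assumption).

def calibration_error(trust: list[float], capability: list[float]) -> float:
    """Mean absolute gap between trust and capability.
    0 = perfectly calibrated; gaps above the diagonal indicate
    overtrust, gaps below it distrust."""
    return sum(abs(t - c) for t, c in zip(trust, capability)) / len(trust)

def resolution(trust: list[float], capability: list[float]) -> float:
    """Ratio of the trust range to the capability range.
    Near 0 = poor resolution (a wide range of capability maps onto
    a narrow range of trust); near 1 = good resolution."""
    cap_range = max(capability) - min(capability)
    trust_range = max(trust) - min(trust)
    return trust_range / cap_range if cap_range else 0.0

# Hypothetical operator whose trust barely moves as capability swings:
capability = [0.20, 0.40, 0.60, 0.80, 0.95]
trust =      [0.55, 0.58, 0.60, 0.62, 0.65]

print(f"calibration error: {calibration_error(trust, capability):.2f}")
print(f"resolution:        {resolution(trust, capability):.2f}")
# Low resolution here (0.13): a 0.75 swing in capability produces only
# a 0.10 swing in trust, so overtrust at low capability and distrust at
# high capability coexist despite a modest average calibration error.
```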

Individual, Organizational, and Cultural Context

Trust does not develop in a vacuum but instead evolves in a complex individual, cultural, and organizational context. The individual context includes individual differences, such as the propensity to trust. These differences influence the initial level of trust and influence how new information is interpreted. The individual context also includes a person's specific history of interactions that have led to a particular level of trust. The organizational context can also have a strong influence on trust. The organizational context reflects the interactions between people that inform them about the trustworthiness of others, which can include reputation and gossip. The cultural context also influences trust through social norms and expectations. Understanding trust requires a careful consideration of the individual, organizational, and cultural context.

Systematic individual differences influence trust between people. Some people are more inclined to trust than are others (Gaines et al., 1997; Stack, 1978). Rotter (1967) defined trust as an enduring personality trait. This conception of trust follows a social learning theory approach, in which expectations for a particular situation are determined by specific previous experiences with situations that are perceived to be similar (Rotter, 1971). People develop beliefs about others that are generalized and extrapolated from one interaction to another. In this context, trust is a generalized expectancy that is independent of specific experiences and based on the generalization of a large number of diverse experiences. Individual differences regarding trust have important implications for the study of human-automation trust because they may influence reliance in ways that are not directly related to the characteristics of the automation.

Substantial evidence demonstrates that the tendency to trust, considered as a personality trait, can be reliably measured and can influence behavior in a systematic manner. For example, Rotter's (1980) Interpersonal Trust Scale reliably differentiates people on their propensity to trust others, with an internal consistency of .76 and a test-retest reliability of .56 with 7 months between tests. Interestingly, high-trust individuals are not more gullible than low-trust individuals (Gurtman, 1992), and the propensity to trust is not correlated with measures of intellect (Rotter, 1980); however, high-trust individuals are viewed as more trustworthy by others and display more truthful behavior (Rotter, 1971). In fact, people with a high propensity to trust predicted others' trustworthiness better than those with a low propensity to trust (Kikuchi, Wantanabe, & Yamasishi, 1996). Likewise, low- and high-trust individuals respond differently to feedback regarding collaborators' intentions and to situational risk (Kramer, 1999).

These findings may explain why individual differences in the general tendency to trust automation, as measured by a complacency scale (Parasuraman, Singh, Molloy, & Parasuraman, 1992), are not clearly related to misuse of automation. For example, Singh, Molloy, and Parasuraman (1993) found that high-complacency individuals actually detected more automation failures in a constant reliability condition (53.4%, compared with 18.7% for low-complacency individuals). This unexpected result is similar to findings in studies of human trust in which highly trusting individuals were found to trust more appropriately. Several studies of trust in automation show that for some people trust changes substantially as the capability of the automation changes, and for other people trust changes relatively little (Lee & Moray, 1994; Masalonis, 2000). One possible explanation that merits further investigation is that high-trust individuals may be better able to adjust their trust to situations in which the automation is highly capable as well as to situations in which it is not.

In contrast to the description of trust as a stable personality trait, most researchers have focused on trust as an attitude (Jones & George, 1998). In describing interpersonal relationships, trust has been considered as a dynamic attitude that evolves along with the developing relationship (Rempel et al., 1985). The influence of individual differences regarding the predisposition to trust is most important when a situation is ambiguous and generalized expectancies dominate, and it becomes less important as the relationship progresses (McKnight, Cummings, & Chervany, 1998). Trust as an attitude is a history-dependent variable that depends on the prior behavior of the trusted person and the information that is shared (Deutsch, 1958). The initial level of trust is determined by past experiences in similar situations; some of these experiences may even be indirect, as in the case of gossip.

The organizational context influences how reputation, gossip, and formal and informal roles affect the trust of people who have never had any direct contact with the trustee (Burt & Knez, 1996; Kramer, 1999). For example, engineers are trusted not because of the ability of any specific person but because of the underlying education and regulatory structure that governs people in the role of an engineer (Kramer, 1999). The degree to which the organizational context affects the indirect development of trust depends on the strength of links in the social network and on the trustor's ability to establish links between the situations experienced by others and the situations he or she is confronting (Doney et al., 1998). These findings suggest that the organizational context and the indirect exposure that it facilitates can have an important influence on trust and reliance.

Beyond the individual and organizational context, culture is another element of context that can influence trust and its development (Baba, Falkenburg, & Hill, 1996). Culture can be defined as a set of social norms and expectations that reflect shared educational and life experiences associated with national differences or distinct cohorts of workers. In particular, Japanese citizens have been found to have a generally low level of trust (Yamagishi & Yamagishi, 1994). In Japan, networks of mutually committed relations play a more important role in governing exchange relationships than they do in the United States. A cross-national questionnaire that included 1,136 Japanese and 501 American respondents showed that the American respondents were more trusting of other people in general and considered reputation more important. In contrast, the Japanese respondents placed a higher value on exchange relationships that are based on personal relationships. One explanation for this is that the extreme social stability of mutually committed relationships in Japan reduces uncertainty about transactions and diminishes the role of trust (Doney et al., 1998).

More generally, cultural differences associated with power distance (e.g., dependence on authority and respect for authoritarian norms), uncertainty avoidance, and individualist and collectivist attitudes can influence the development and role of trust (Doney et al., 1998). Cultural variation can influence trust in an on-line environment. Karvonen (2001) compared trust in E-commerce services among consumers from Finland, Sweden, and Iceland. Although all trusted a simple design, there were substantial differences in the initial trust in a complex design, with Finnish customers being the most wary and Icelandic customers the most trusting. More generally, the level of social trust varies substantially among countries, and this variation accounts for over 64% of the variance in the level of Internet adoption (Huang, Keser, Leland, & Shachat, 2002). The culturally dependent nature of trust suggests that findings regarding trust in automation may need to be verified when they are extrapolated from one cultural setting to another.

Cultural context can also describe systematic differences in groups of workers. For example, in the context of trust in automation, Riley (1996) found that pilots who are accustomed to using automation trusted and relied on automation more than did students who were not as familiar with automation. Likewise, in field studies of operators adapting to computerized manufacturing systems, Zuboff (1988) found that the culture associated with those operators who had been exposed to computer systems led to greater trust and acceptance of automation. These results show that trust in automation, like trust in people, is culturally influenced in that it depends on the long-term experiences of a group of people.

Trust between people depends on the individual, organizational, and cultural context. This context influences trust because it affects initial levels of trust and how people interpret information regarding the agent. Some evidence shows that these relationships may also hold for trust in automation; however, very little research has investigated any of these factors. We address the types of information that form the basis of trust in the next section.

Basis of Trust: Levels of Detail and of Attributional Abstraction

Trust is a multidimensional construct that is based on trustee characteristics such as motives, intentions, and actions (Bhattacharya, Devinney, & Pillutla, 1998; Mayer et al., 1995). The basis for trust is the information that informs the person about the ability of the trustee to achieve the goals of the trustor. Two critical elements define the basis of trust. The first is the focus of trust: What is to be trusted? The second is the type of information that describes the entity to be trusted: What is the information supporting trust? This information guides expectations regarding how well the trustee can achieve the trustor's goals. The focus of trust can be described according to levels of detail, and the information supporting trust can be described according to three levels of attributional abstraction.

The focus of trust can be considered along a dimension defined by the level of detail, which varies from general trust of institutions and organizations to trust in a specific agent. With automation, this might correspond to trust in an overall system of automation as compared with trust in a particular mode of an automatic controller. Couch and Jones (1997) considered three levels of trust – generalized, network, and partner trust – which are similar to the three levels identified by McKnight et al. (1998). General trust corresponds to trust in human nature and institutions (Barber, 1983). Network trust refers to the cluster of acquaintances that constitutes a person's social network, whereas partner trust corresponds to trust in specific persons (Rempel et al., 1985).

There does not seem to be a reliable relationship between global and specific trust, suggesting that this dimension makes important distinctions regarding the evolution of trust (Couch & Jones, 1997). For example, trust in a supervisor and trust in an organization are distinct and depend on qualitatively different factors (Tan & Tan, 2000). Trust in a supervisor depends on perceived ability, integrity, and benevolence, whereas trust in the organization depends on more distal variables of perceived organizational support and justice. General trust is frequently considered to be a personality trait, whereas specific trust is frequently viewed as an outcome of a specific interaction (Couch & Jones, 1997). Several studies show, however, that general and specific trust are not necessarily tied to the "trait and state" distinctions and may instead simply refer to different levels of detail (Sitkin & Roth, 1993). Developing a high degree of functional specificity in the trust of automation may require specific information for each level of detail of the automation.

The basis of trust can be considered along a dimension of attributional abstraction, which varies from demonstrations of competence to the intentions of the agent. A recent review of the trust literature concluded that three general levels summarize the bases of trust: ability, integrity, and benevolence (Mayer et al., 1995). Ability is the group of skills, competencies, and characteristics that enable the trustee to influence the domain. Integrity is the degree to which the trustee adheres to a set of principles the trustor finds acceptable. Benevolence is the extent to which the intents and motivations of the trustee are aligned with those of the trustor.

In the context of interpersonal relationships, Rempel et al. (1985) viewed trust as an evolving phenomenon, with the basis of trust changing as the relationship progressed. They argued that predictability, the degree to which future behavior can be anticipated (and which is similar to ability), forms the basis of trust early in a relationship. This is followed by dependability, which is the degree to which behavior is consistent and is similar to integrity. As the relationship matures, the basis of trust ultimately shifts to faith, which is a more general judgment that a person can be relied upon and is similar to benevolence. A similar progression emerged in a study of operators' adaptation to new technology (Zuboff, 1988): trust in that context depended on trial-and-error experience, followed by understanding of the technology's operation, and finally faith. Lee and Moray (1992) made similar distinctions in defining the factors that influence trust in automation. They identified performance, process, and purpose as the general bases of trust.

Performance refers to the current and historical operation of the automation and includes characteristics such as reliability, predictability, and ability. Performance information describes what the automation does. More specifically, performance refers to the competency or expertise of the automation as demonstrated by its ability to achieve the operator's goals. Because performance is linked to the ability to achieve specific goals, it demonstrates the task- and situation-dependent nature of trust. This is similar to Sheridan's (1992) concept of robustness as a basis for trust in human-automation relationships. The operator will tend to trust automation that performs in a manner that reliably achieves his or her goals.

Process is the degree to which the automation's algorithms are appropriate for the situation and able to achieve the operator's goals. Process information describes how the automation operates. In interpersonal relationships, this corresponds to the consistency of actions associated with adherence to a set of acceptable principles (Mayer et al., 1995). Process as a basis for trust reflects a shift away from focus on specific behaviors and toward qualities and characteristics attributed to an agent. With process, trust is in the agent and not in the specific actions of the agent. Because of this, the process basis of trust relies on dispositional attributions and inferences drawn from the performance of the agent. In this sense, process is similar to dependability and integrity. Openness, the willingness to give and receive ideas, is another element of the process basis of trust. Interestingly, consistency and openness are more important for trust in peers than for trust in supervisors or subordinates (Schindler & Thomas, 1993).

In the context of automation, the process basis of trust refers to the algorithms and operations that govern the behavior of the automation. This concept is similar to Sheridan's (1992) concept of understandability in human-automation relationships. The operator will tend to trust the automation if its algorithms can be understood and seem capable of achieving the operator's goals in the current situation.

Purpose refers to the degree to which the automation is being used within the realm of the designer's intent. Purpose describes why the automation was developed. Purpose corresponds to faith and benevolence and reflects the perception that the trustee has a positive orientation toward the trustor. With interpersonal relationships, the perception of such a positive orientation depends on the intentions and motives of the trustee. This can take the form of abstract, generalized value congruence (Sitkin & Roth, 1993), which can be described as whether, and to what extent, the trustee has a motive to lie (Hovland, Janis, & Kelly, 1953). The purpose basis of trust reflects the attribution of these characteristics to the automation. Frequently, whether or not this attribution takes place will depend on whether the designer's intent has been communicated to the operator. If so, the operator will tend to trust the automation to achieve the goals it was designed to achieve.

Table 1 builds on the work of Mayer et al. (1995) and shows how the many characteristics of a trustee that affect trust can be summarized by the three bases of performance, process, and purpose.


TABLE 1: Summary of the Dimensions That Describe the Basis of Trust

Study: Basis of Trust (Summary Dimension)

Barber (1983): Competence (Performance); Persistence (Process); Fiduciary responsibility (Purpose)

Butler & Cantrell (1984): Competence (Performance); Integrity (Process); Consistency (Process); Loyalty (Purpose); Openness (Process)

Cook & Wall (1980): Ability (Performance); Intentions (Purpose)

Deutsch (1960): Ability (Performance); Intentions (Purpose)

Gabarro (1978): Integrity (Process); Motives (Purpose); Openness (Process); Discreetness (Process); Functional/specific competence (Performance); Interpersonal competence (Performance); Business sense (Performance); Judgment (Performance)

Hovland, Janis, & Kelly (1953): Expertise (Performance); Motivation to lie (Purpose)

Jennings (1967): Loyalty (Purpose); Predictability (Process); Accessibility (Process); Availability (Process)

Kee & Knox (1970): Competence (Performance); Motives (Purpose)

Mayer et al. (1995): Ability (Performance); Integrity (Process); Benevolence (Purpose)

Mishra (1996): Competency (Performance); Reliability (Performance); Openness (Process); Concern (Purpose)

Moorman et al. (1993): Integrity (Process); Willingness to reduce uncertainty (Process); Confidentiality (Process); Expertise (Performance); Tactfulness (Process); Sincerity (Process); Congeniality (Performance); Timeliness (Performance)

Rempel et al. (1985): Reliability (Performance); Dependability (Process); Faith (Purpose)

Sitkin & Roth (1993): Context-specific reliability (Performance); Generalized value congruence (Purpose)

Zuboff (1988): Trial-and-error experience (Performance); Understanding (Process); Leap of faith (Purpose)


As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.

Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms associated with the dimension of process, and analysis of internal mechanisms can support inferences regarding the designer's intent, associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by a manager (purpose basis for trust) and the manager's actions (performance basis of trust) has a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values.

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.


The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings or procedures guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oatley, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oatley). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoner's dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation, and these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions; it also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner. However, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejtersen, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to the automation, depends on people's assessment of how others perceive them: people are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation than to the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves; particularly in romantic relationships, it often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop differently from the way it develops in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of the dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct for understanding reliance on automation; however, the differences in symmetry, the lack of intentionality, and the differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust, and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.


Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with this type of automation (Farrell & Lewandowsky, 2000).

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication; the automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator; the automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of E-mail messages (Coutu, 1998). A similar effect is seen in which initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
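To make this feedback loop concrete, consider a minimal simulation sketch. This is our illustration rather than a model from the studies reviewed here; the update rule, the parameter values, and the trust-versus-self-confidence threshold are all assumptions.

    # Illustrative sketch (not a published model): trust is updated toward
    # the automation's actual capability only while the operator relies on
    # it, and reliance occurs only when trust exceeds self-confidence.
    def simulate(initial_trust, capability=0.8, self_confidence=0.5,
                 learning_rate=0.2, steps=50):
        """Return the final trust level after repeated interactions."""
        trust = initial_trust
        for _ in range(steps):
            if trust > self_confidence:
                # Reliance provides observations of the automation, so
                # trust moves toward the automation's true capability.
                trust += learning_rate * (capability - trust)
            # Under manual control there are no new observations, so
            # trust remains unchanged and the loop never reopens.
        return trust

    print(simulate(initial_trust=0.55))  # climbs toward 0.8: sustained reliance
    print(simulate(initial_trust=0.45))  # stays at 0.45: sustained manual control

Although deliberately simplistic, the sketch reproduces the qualitative pattern described above: a small difference in initial trust yields two qualitatively different stable outcomes.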

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function: the largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
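One way to write such a first-order model (the notation here is ours; Lee & Moray, 1994, report the time-series analysis itself, not this particular parameterization) is

    \tau \frac{dT(t)}{dt} = T_\infty(t) - T(t)

where \(T(t)\) is the current level of trust, \(T_\infty(t)\) is the level of trust warranted by the automation's current capability, and \(\tau\) is the time constant. For a step change in capability at \(t = 0\), such as a fault, the solution is

    T(t) = T_\infty + (T_0 - T_\infty) e^{-t/\tau}

so the largest change in trust occurs immediately, the residual effect decays exponentially over time, and a smaller time constant \(\tau\) corresponds to greater temporal specificity.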

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change; enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
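As an illustration of what hysteresis in reliance could look like, the following sketch uses a higher threshold for engaging the automation than for abandoning it, so the switch point depends on the direction of the change in trust. This is our construction; the threshold values are arbitrary assumptions, not estimates from the cited work.

    # Illustrative sketch of hysteresis in the reliance decision: the
    # trust needed to engage the automation exceeds the trust at which
    # a current user disengages. Threshold values are arbitrary.
    ENGAGE_THRESHOLD = 0.7
    DISENGAGE_THRESHOLD = 0.4

    def update_reliance(relying, trust):
        """Return whether the operator relies on the automation."""
        if relying:
            return trust >= DISENGAGE_THRESHOLD
        return trust >= ENGAGE_THRESHOLD

    # Sweep trust up and then down: reliance switches on at 0.75 but
    # does not switch off until trust falls below 0.40.
    relying = False
    for trust in [0.30, 0.50, 0.75, 0.60, 0.45, 0.35]:
        relying = update_reliance(relying, trust)
        print(f"trust={trust:.2f} -> relying={relying}")

Enhanced contrast could be sketched analogously, with the response to a change in trust accelerated, rather than delayed, in one direction.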

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that automation should have the final authority for decisions and actions that must be allocated to automation in time-critical situations (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors cause trust and reliance to decline more for those in a familiar city, whose self-confidence was high, than for those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation's performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996); the findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003).

Similar to probability matching in signal detec-tion theory it may be appropriate to rely on im-perfect automation even though it will lead tosome incorrect outcomes (Wickens et al 2000)
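To make this cost-balancing idea concrete, consider the following sketch (ours, not from the cited studies; all parameter values are hypothetical). It finds the sampling interval that minimizes the combined per-second cost of checking the automation and of failures going undetected between checks:

```python
# Illustrative sketch of eutactic (cost-balanced) monitoring.
# The costs and failure rate below are invented for illustration.

def expected_cost_per_s(interval_s, p_fail_per_s, cost_per_look, cost_per_s_undetected):
    """Expected cost per second of sampling the automation every interval_s seconds."""
    monitoring = cost_per_look / interval_s        # cost of looking
    mean_delay = interval_s / 2.0                  # a failure waits half an interval, on average
    failures = p_fail_per_s * cost_per_s_undetected * mean_delay
    return monitoring + failures

candidates = [1, 2, 5, 10, 20, 60, 120]
best = min(candidates, key=lambda i: expected_cost_per_s(i, 1e-3, 0.04, 10.0))
print(f"cost-minimizing sampling interval: {best} s")
```

By this criterion, sampling less often than the minimizing interval is complacent, sampling more often is skeptical, and sampling at the minimum is eutactic.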

One approach to reducing overtrust in the automation and increasing the detection of failures is adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.
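As a rough sketch of such an allocation policy (the cyclic schedule is our generalization; the cited study inserted a single 10-min manual block mid-session), consider:

```python
# Toy adaptive-allocation schedule inspired by Parasuraman, Mouloua,
# and Molloy (1996). Function name and schedule are illustrative.

def allocate(minute: int, cycle_min: int = 60, manual_min: int = 10) -> str:
    """Return who controls the monitored task during a given minute:
    the first manual_min minutes of each cycle go to the operator,
    keeping failure detection exercised; the rest are automated."""
    return "manual" if minute % cycle_min < manual_min else "automatic"

schedule = [allocate(m) for m in range(120)]
print(schedule[0], schedule[15], schedule[60], schedule[75])  # manual automatic manual automatic
```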

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault: A fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the process dimension was most influenced by the presence or absence of a fault, as compared with the performance or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: Negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance even when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude: A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al., 2000; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.
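One way to see how the nonlinear feedback loop described above could produce such history effects is with a small simulation. The following sketch is ours rather than a model from the literature; the update rule, parameters, and reliability schedules are all illustrative assumptions. The operator relies on the automation with probability equal to current trust, observes its performance only on trials where it is used, and updates trust asymmetrically:

```python
import random

# Illustrative sketch (not a published model) of trust-mediated sampling.
# A fault pulls trust toward 0 four times harder than a success pulls it
# toward 1, and a distrusted system is sampled less, so it recovers slowly.

def final_trust(schedule, seed, gain=0.1, fault_weight=4.0):
    rng = random.Random(seed)
    trust = 0.5
    for p_ok in schedule:                      # p_ok = reliability on this trial
        if rng.random() < trust:               # low trust -> system rarely sampled
            ok = rng.random() < p_ok
            target, w = (1.0, 1.0) if ok else (0.0, fault_weight)
            trust += gain * w * (target - trust)
    return trust

shaky_start = [0.6] * 30 + [0.95] * 70        # early faults, then reliable
solid_start = [0.95] * 100                    # reliable throughout

for label, sched in [("shaky start", shaky_start), ("solid start", solid_start)]:
    mean = sum(final_trust(sched, s) for s in range(500)) / 500
    print(f"{label}: mean final trust = {mean:.2f}")
```

Because low trust suppresses sampling, early faults in the first schedule continue to depress trust long after reliability has improved, a simple mechanism consistent with the lasting first-impression effect reported by Fox and Boehm-Davis (1998).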

The environmental context not only influences trust and reliance; it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, the appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an online survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust: These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.
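Weighting sources by their reliability, as participants did in the Montgomery and Sorkin (1996) study, is also the normative strategy: In a simple signal detection formulation, each source's report contributes a log-likelihood ratio whose magnitude grows with the source's reliability. The following sketch is ours; the symmetric-accuracy assumption and all numbers are illustrative:

```python
import math

def posterior_signal(reports, reliabilities, prior=0.5):
    """reports: True if a source says 'signal'; reliabilities: the
    probability each source reports correctly (assumed symmetric)."""
    logit = math.log(prior / (1 - prior))
    for says_signal, r in zip(reports, reliabilities):
        llr = math.log(r / (1 - r))        # evidence weight of one report
        logit += llr if says_signal else -llr
    return 1 / (1 + math.exp(-logit))

# One reliable source (0.9) outweighs two unreliable dissenters (0.6 each).
print(round(posterior_signal([True, False, False], [0.9, 0.6, 0.6]), 2))
```

Luminance coding in effect hands the operator these evidence weights directly, so that perceptual salience tracks diagnostic value.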

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent: A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: Specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk than when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent (see the sketch following this list).
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
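As a rough illustration of what evaluating calibration and resolution might involve, the following sketch (ours; the data and binning are invented for illustration) groups paired observations of rated trust and observed automation success, then compares mean trust with the observed success rate in each group. Close agreement within groups indicates calibration; differences in success rate across groups indicate resolution:

```python
from statistics import mean

# Hypothetical (trust rating, automation succeeded?) pairs -- invented
# for illustration, not data from any cited study.
trials = [
    (0.90, True), (0.80, True), (0.85, False), (0.30, False),
    (0.20, False), (0.40, True), (0.75, True), (0.25, False),
]

def bin_index(rating, n_bins=4):
    """Map a rating in [0, 1] to one of n_bins equal-width bins."""
    return min(int(rating * n_bins), n_bins - 1)

bins = {}
for rating, succeeded in trials:
    bins.setdefault(bin_index(rating), []).append((rating, succeeded))

for b in sorted(bins):
    ratings, outcomes = zip(*bins[b])
    # Calibration: mean rated trust should track the observed success rate.
    print(f"bin {b}: mean trust {mean(ratings):.2f}, success rate {mean(outcomes):.2f}")
```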

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences the automation's capability. The concept of ecological interface design has led to a substantial research effort focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research on human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure in the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people: Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation 1: Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation 2: Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology, and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering, with a focus on trust in automation, from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003

Page 3: Trust in Automation: Designing for Appropriate Reliancecsl/publications/pdf/leesee04.pdfcombination of many modes, and reliance is of-tena more graded process. Automation reliance

52 Spring 2004 ndash Human Factors

individuals and organizations and even betweenorganizations Specifically trust has been inves-tigated as a critical factor in interpersonal rela-tionships where the focus is often on romanticrelationships (Rempel et al 1985) In exchangerelationships another important research areathe focus is on trust between management andemployees or between supervisors and subor-dinates (Tan amp Tan 2000) Trust has also beenidentified as a critical factor in increasing orga-nizational productivity and strengthening or-ganizational commitment (Nyhan 2000) Trustbetween firms and customers has become animportant consideration in the context of rela-tionship management (Morgan amp Hunt 1994)and Internet commerce (Muller 1996) Research-ers have even considered the issue of trust in thecontext of the relationship between organiza-tions such as those in multinational firms (Ringamp Vandeven 1992) in which cross-disciplinaryand cross-cultural collaboration is critical (DoneyCannon amp Mullen 1998)

Interest in trust has grown dramatically inthe last 5 years as many have come to recognizeits importance in promoting efficient transac-tions and cooperation Trust has emerged as acentral focus of organizational theory (Kramer1999) has been the focus of recent special issuesof the Academy of Management Review (Jonesamp George 1998) and the International Journalof Human-Computer Studies (Corritore Kracheramp Wiedenbeck 2003b) was the topic of a work-shop at the CHI 2001 meeting (Corritore Kra-cher amp Wiedenbeck 2001a) and has been thetopic of books such as that by Kramer and Tyler(1996)

The general theme of the increasing cognitive complexity of automation, organizations, and interpersonal interactions explains the recent interest in trust. Trust tends to be less important in well-structured, stable environments, such as procedure-based hierarchical organizations in which an emphasis on order and stability minimizes transactional uncertainty (Moorman, Deshpande, & Zaltman, 1993). Many organizations, however, have recently adopted agile structures, self-directed work groups, matrix structures, and complex automation, all of which make the workplace increasingly complex, unstable, and uncertain. Because these changes enable rapid adaptation to change and accommodate unanticipated variability, there is a trend away from well-structured, procedure-based environments. Although these changes have the potential to make organizations and individuals more productive and able to adapt to the unanticipated (Vicente, 1999), they also increase cognitive complexity and leave more degrees of freedom for the individual to resolve. Trust plays a critical role in people's ability to accommodate the cognitive complexity and uncertainty that accompanies the move away from highly structured organizations and simple technology.

Trust helps people to accommodate complexity in several ways. It supplants supervision when direct observation becomes impractical, and it facilitates choice under uncertainty by acting as a social decision heuristic (Kramer, 1999). It also reduces uncertainty in gauging the responses of others, thereby guiding appropriate reliance and generating a collaborative advantage (Baba, 1999; Ostrom, 1998). Moreover, trust facilitates decentralization and adaptive behavior by making it possible to replace fixed protocols, reporting structures, and procedures with goal-related expectations regarding the capabilities of others. The increased complexity and uncertainty that has inspired the recent interest in trust in other fields parallels the increased complexity and sophistication of automation. Trust in automation guides reliance when the complexity of the automation makes a complete understanding impractical and when the situation demands adaptive behavior that procedures cannot guide. For this reason, the recent interest in trust in other disciplines provides a rich and appropriate theoretical base for understanding how trust mediates reliance on complex automation and, more generally, how it affects computer-mediated collaboration that involves both human and computer agents.

Definition of Trust: Beliefs, Attitudes, Intentions, and Behavior

Not surprisingly, the diverse interest in trust has generated many definitions. This is particularly true when considering how trust relates to automation (Cohen, Parasuraman, & Freeman, 1999; Muir, 1994). By examining the differences and common themes of these definitions, it is possible to identify critical considerations for understanding the role of trust in mediating human-automation interaction. Some researchers focus on trust as an attitude or expectation, and they tend to define trust in one of the following ways: "expectancy held by an individual that the word, promise, or written communication of another can be relied upon" (Rotter, 1967, p. 651); "expectation related to subjective probability an individual assigns to the occurrence of some set of future events" (Rempel et al., 1985, p. 96); "expectation of technically competent role performance" (Barber, 1983, p. 14); or "expectations of fiduciary obligation and responsibility, that is, the expectation that some others in our social relationships have moral obligations and responsibility to demonstrate a special concern for others' interests above their own" (Barber, p. 14). These definitions all include the element of expectation regarding behaviors or outcomes. Clearly, trust concerns an expectancy or an attitude regarding the likelihood of favorable responses.

Another common approach characterizes trust as an intention or willingness to act. This goes beyond attitude in that trust is characterized as an intention to behave in a certain manner or to enter into a state of vulnerability. For example, trust has been defined as "willingness to place oneself in a relationship that establishes or increases vulnerability with the reliance upon someone or something to perform as expected" (Johns, 1996, p. 81); "willingness to rely on an exchange partner in whom one has confidence" (Moorman et al., 1993, p. 82); and "willingness of a party to be vulnerable to the actions of another party based on the expectation that the other will perform a particular action important to the trustor, irrespective of the ability to monitor or control that party" (Mayer, Davis, & Schoorman, 1995, p. 712).

The definition by Mayer et al. (1995) is the most widely used and accepted definition of trust (Rousseau, Sitkin, Burt, & Camerer, 1998). As of April 2003, the Institute for Scientific Information citation database showed 203 citations of this article, far more than others on the topic of trust. The definition identifies vulnerability as a critical element of trust: For trust to be an important part of a relationship, individuals must willingly put themselves at risk or in vulnerable positions by delegating responsibility for actions to another party.

Some authors go beyond intention and define trust as a behavioral result or state of vulnerability or risk (Deutsch, 1960; Meyer, 2001). According to these definitions, trust is the outcome of actions that place people into certain states or situations. It can be seen, for example, as "a state of perceived vulnerability or risk that is derived from an individual's uncertainty regarding the motives, intentions, and prospective actions of others on whom they depend" (Kramer, 1999, p. 571).

These definitions highlight some important inconsistencies regarding whether trust is a belief, attitude, intention, or behavior. These distinctions are of great theoretical importance, as multiple factors mediate the process of translating beliefs and attitudes into behaviors.

Ajzen and Fishbein (1980; Fishbein & Ajzen, 1975) developed a framework that can help reconcile these conflicting definitions of trust. Their framework shows that behaviors result from intentions and that intentions are a function of attitudes. Attitudes, in turn, are based on beliefs. According to this framework, beliefs and perceptions represent the information base that determines attitudes. The availability of information and the person's experiences influence beliefs. An attitude is an affective evaluation of beliefs that guides people to adopt a particular intention. Intentions then translate into behavior, according to the environmental and cognitive constraints a person faces. In the context of trust and reliance, trust is an attitude and reliance is a behavior. This framework keeps beliefs, attitudes, intentions, and behavior conceptually distinct and can help explain the influence of trust on reliance. According to this framework, trust affects reliance as an attitude rather than as a belief, intention, or behavior. Beliefs underlie trust, and various intentions and behaviors may result from different levels of trust.
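Although the article presents this framework conceptually, the belief-attitude-intention-behavior chain can be made concrete with a small sketch. The code below is an illustration, not the authors' model: the names, the 0-1 belief ratings, and the comparisons (e.g., trust against self-confidence, time to engage against time available) are hypothetical stand-ins chosen to show how each stage can break the link between trust and reliance.

```python
# Illustrative sketch of the Ajzen-Fishbein chain applied to reliance.
# All names, scales, and thresholds are hypothetical stand-ins.

def attitude_from_beliefs(beliefs):
    """Trust as an attitude: an affective evaluation (here, simply a mean)
    of beliefs about the automation, each rated on a 0-1 scale."""
    return sum(beliefs.values()) / len(beliefs)

def intention_from_attitude(trust, self_confidence):
    """Intention to rely forms here only when trust in the automation
    exceeds self-confidence in manual control, one of several factors
    that mediate between attitude and intention."""
    return trust > self_confidence

def behavior_from_intention(intends_to_rely, time_to_engage, time_available):
    """Environmental constraints (e.g., no time to engage the automation)
    can still block reliance even when the intention is present."""
    return intends_to_rely and time_to_engage <= time_available

beliefs = {"performance": 0.8, "process": 0.6, "purpose": 0.7}
trust = attitude_from_beliefs(beliefs)                        # attitude
intends = intention_from_attitude(trust, self_confidence=0.5) # intention
relies = behavior_from_intention(intends, time_to_engage=2.0,
                                 time_available=5.0)          # behavior
print(f"trust={trust:.2f}, intends={intends}, relies={relies}")
```

The point of the sketch is the separation of stages: changing the time constraint can flip reliance without any change in trust, which is exactly why treating trust itself as a behavior would confound distinct factors.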

Considering trust as an intention or behavior has the potential to confuse its effect with the effects of other factors that can influence behavior, such as workload, situation awareness, and self-confidence of the operator (Lee & Moray, 1994; Riley, 1994). Trust is not the only factor mediating the relationship between beliefs and behavior. Other psychological, system, and environmental constraints intervene, such as when operators do not have enough time to engage the automation even though they trust it and intend to use it, or when the effort to engage the automation outweighs its benefits (Kirlik, 1993). Trust stands between beliefs about the characteristics of the automation and the intention to rely on the automation.

Many definitions of trust also indicate the importance of the goal-oriented nature of trust. Although many definitions do not identify this aspect of trust explicitly, several mention the ability of the trustee to perform an important action and the expectation of the trustor that the trustee will perform as expected or can be relied upon (Gurtman, 1992; Johns, 1996; Mayer et al., 1995). These definitions describe the basis of trust in terms of the performance of an agent, the trustee, who furthers the goals of an individual, the trustor. In this way, trust describes a relationship that depends on the characteristics of the trustee, the trustor, and the goal-related context of the interaction. Trust is not a consideration in situations where the trustor does not depend on the trustee to perform some function related to the trustor's goals.

A simple definition of trust consistent with these considerations is the attitude that an agent will help achieve an individual's goals in a situation characterized by uncertainty and vulnerability. This basic definition must be elaborated to consider the appropriateness of trust, the influence of context, the goal-related characteristics of the agent, and the cognitive processes that govern the development and erosion of trust. Figure 1 shows how these factors interact in a dynamic process of reliance, and the following sections elaborate on various components of this conceptual model.

Figure 1. The interaction of context, agent characteristics, and cognitive properties with the appropriateness of trust.

First, we will consider the appropriateness of trust. In Figure 1, appropriateness is shown as the relationship between the true capabilities of the agent and the level of trust. Second, we will consider the context, defined by the characteristics of the individual, organization, and culture. Figure 1 shows this as a collection of factors at the top of the diagram, each affecting a different element of the belief-attitude-intention-behavior sequence associated with reliance on an agent. Third, we will describe the basis of trust. This defines the different types of information needed to maintain an appropriate level of trust, which the bottom of Figure 1 shows in terms of the purpose, process, and performance dimensions that describe the goal-oriented characteristics of the agent. Finally, regarding the cognitive process governing trust, trust depends on the interplay among the analytic, analogical, and affective processes. In each of these sections we will cite literature regarding the organizational, sociological, interpersonal, psychological, and neurological perspectives of human-human trust and then draw parallels with human-automation trust. The understanding of trust in automation can benefit from a multidisciplinary consideration of how context, agent characteristics, and cognitive processes affect the appropriateness of trust.

Appropriate Trust: Calibration, Resolution, and Specificity

Inappropriate reliance associated with misuse and disuse depends in part on how well trust matches the true capabilities of the automation. Supporting appropriate trust is critical in avoiding misuse and disuse of automation, just as it is in facilitating effective interpersonal relationships (Wicks, Berman, & Jones, 1999).

Calibration, resolution, and specificity of trust describe mismatches between trust and the capabilities of automation. Calibration refers to the correspondence between a person's trust in the automation and the automation's capabilities (Lee & Moray, 1994; Muir, 1987). The definitions of appropriate calibration of trust parallel those of misuse and disuse in describing appropriate reliance. Overtrust is poor calibration in which trust exceeds system capabilities; with distrust, trust falls short of the automation's capabilities. In Figure 2, good calibration is represented by the diagonal line, where the level of trust matches automation capabilities. Above this line is overtrust, and below it is distrust.

Figure 2. The relationship among calibration, resolution, and automation capability in defining appropriate trust in automation. Overtrust may lead to misuse and distrust may lead to disuse. (The figure plots trust against automation capability, or trustworthiness: calibrated trust, where trust matches system capabilities, leads to appropriate use; overtrust, where trust exceeds system capabilities, leads to misuse; distrust, where trust falls short of system capabilities, leads to disuse. Poor resolution occurs when a large range of system capability maps onto a small range of trust; good resolution occurs when a range of system capability maps onto a comparable range of trust.)

Resolution refers to how precisely a judgment of trust differentiates levels of automation capability (Cohen et al., 1999). Figure 2 shows that poor resolution occurs when a large range of automation capability maps onto a small range of trust. With low resolution, large changes in automation capability are reflected in small changes in trust. Specificity refers to the degree to which trust is associated with a particular component or aspect of the trustee. Functional specificity describes the differentiation of functions, subfunctions, and modes of automation. With high functional specificity, a person's trust reflects capabilities of specific subfunctions and modes. Low functional specificity means the person's trust reflects the capabilities of the entire system.

Specificity can also describe changes in trust as a function of the situation or over time. High temporal specificity means that a person's trust reflects moment-to-moment fluctuations in automation capability, whereas low temporal specificity means that the trust reflects only long-term changes in automation capability. Although temporal specificity implies a generic change over time as the person's trust adjusts to failures with the automation, temporal specificity also addresses adjustments that should occur when the situation or context changes and affects the capability of the automation. Temporal specificity reflects the sensitivity of trust to changes in context that affect automation capability. High functional and temporal specificity increase the likelihood that the level of trust will match the capabilities of a particular element of the automation at a particular time. Good calibration, high resolution, and high specificity of trust can mitigate misuse and disuse of automation, and so they can guide design, evaluation, and training to enhance human-automation partnerships.
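These distinctions can be quantified if one assumes that trust and capability are elicited on a common scale. The sketch below is illustrative only: the specific metrics (mean absolute mismatch for calibration error, a signed mean for over- versus distrust, and a ratio of standard deviations for resolution) are our assumptions, not measures proposed in the article.

```python
# Hypothetical calibration and resolution metrics, assuming both trust and
# automation capability are rated on a common 0-1 scale over several episodes.
import statistics

trust      = [0.9, 0.85, 0.8, 0.8, 0.75, 0.7]  # operator's rated trust
capability = [0.6, 0.6, 0.9, 0.9, 0.3, 0.3]    # true capability per episode

# Calibration: mean absolute mismatch; a signed mean above zero indicates
# overtrust, below zero distrust (the diagonal in Figure 2 is zero mismatch).
calibration_error = statistics.mean(abs(t - c) for t, c in zip(trust, capability))
bias = statistics.mean(t - c for t, c in zip(trust, capability))

# Resolution: how much of the variation in capability is reflected in trust.
# A ratio near zero means large capability changes map onto small trust changes.
resolution = statistics.stdev(trust) / statistics.stdev(capability)

print(f"calibration error={calibration_error:.2f}, "
      f"bias={bias:+.2f} ({'overtrust' if bias > 0 else 'distrust'}), "
      f"resolution={resolution:.2f}")
```

In this made-up history the operator's trust drifts down slowly while capability swings widely, so the sketch reports a low resolution ratio: the kind of mismatch the figure depicts as a large range of capability mapping onto a small range of trust.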

Individual, Organizational, and Cultural Context

Trust does not develop in a vacuum but instead evolves in a complex individual, cultural, and organizational context. The individual context includes individual differences, such as the propensity to trust. These differences influence the initial level of trust and influence how new information is interpreted. The individual context also includes a person's specific history of interactions that have led to a particular level of trust. The organizational context can also have a strong influence on trust. The organizational context reflects the interactions between people that inform them about the trustworthiness of others, which can include reputation and gossip. The cultural context also influences trust through social norms and expectations. Understanding trust requires a careful consideration of the individual, organizational, and cultural context.

Systematic individual differences influence trust between people. Some people are more inclined to trust than are others (Gaines et al., 1997; Stack, 1978). Rotter (1967) defined trust as an enduring personality trait. This conception of trust follows a social learning theory approach, in which expectations for a particular situation are determined by specific previous experiences with situations that are perceived to be similar (Rotter, 1971). People develop beliefs about others that are generalized and extrapolated from one interaction to another. In this context, trust is a generalized expectancy that is independent of specific experiences and based on the generalization of a large number of diverse experiences. Individual differences regarding trust have important implications for the study of human-automation trust because they may influence reliance in ways that are not directly related to the characteristics of the automation.

Substantial evidence demonstrates that the tendency to trust, considered as a personality trait, can be reliably measured and can influence behavior in a systematic manner. For example, Rotter's (1980) Interpersonal Trust Scale reliably differentiates people on their propensity to trust others, with an internal consistency of .76 and a test-retest reliability of .56 with 7 months between tests. Interestingly, high-trust individuals are not more gullible than low-trust individuals (Gurtman, 1992), and the propensity to trust is not correlated with measures of intellect (Rotter, 1980); however, high-trust individuals are viewed as more trustworthy by others and display more truthful behavior (Rotter, 1971). In fact, people with a high propensity to trust predicted others' trustworthiness better than those with a low propensity to trust (Kikuchi, Watanabe, & Yamagishi, 1996). Likewise, low- and high-trust individuals respond differently to feedback regarding collaborators' intentions and to situational risk (Kramer, 1999).

These findings may explain why individual differences in the general tendency to trust automation, as measured by a complacency scale (Parasuraman, Singh, Molloy, & Parasuraman, 1992), are not clearly related to misuse of automation. For example, Singh, Molloy, and Parasuraman (1993) found that high-complacency individuals actually detected more automation failures in a constant reliability condition (53.4% compared with 18.7% for low-complacency individuals). This unexpected result is similar to findings in studies of human trust in which highly trusting individuals were found to trust more appropriately. Several studies of trust in automation show that for some people trust changes substantially as the capability of the automation changes, and for other people trust changes relatively little (Lee & Moray, 1994; Masalonis, 2000). One possible explanation that merits further investigation is that high-trust individuals may be better able to adjust their trust to situations in which the automation is highly capable, as well as to situations in which it is not.

In contrast to the description of trust as a stable personality trait, most researchers have focused on trust as an attitude (Jones & George, 1998). In describing interpersonal relationships, trust has been considered as a dynamic attitude that evolves along with the developing relationship (Rempel et al., 1985). The influence of individual differences regarding the predisposition to trust is most important when a situation is ambiguous and generalized expectancies dominate, and it becomes less important as the relationship progresses (McKnight, Cummings, & Chervany, 1998). Trust as an attitude is a history-dependent variable that depends on the prior behavior of the trusted person and the information that is shared (Deutsch, 1958). The initial level of trust is determined by past experiences in similar situations; some of these experiences may even be indirect, as in the case of gossip.

The organizational context influences how reputation, gossip, and formal and informal roles affect the trust of people who have never had any direct contact with the trustee (Burt & Knez, 1996; Kramer, 1999). For example, engineers are trusted not because of the ability of any specific person but because of the underlying education and regulatory structure that governs people in the role of an engineer (Kramer, 1999). The degree to which the organizational context affects the indirect development of trust depends on the strength of links in the social network and on the trustor's ability to establish links between the situations experienced by others and the situations he or she is confronting (Doney et al., 1998). These findings suggest that the organizational context and the indirect exposure that it facilitates can have an important influence on trust and reliance.

Beyond the individual and organizational context, culture is another element of context that can influence trust and its development (Baba, Falkenburg, & Hill, 1996). Culture can be defined as a set of social norms and expectations that reflect shared educational and life experiences associated with national differences or distinct cohorts of workers. In particular, Japanese citizens have been found to have a generally low level of trust (Yamagishi & Yamagishi, 1994). In Japan, networks of mutually committed relations play a more important role in governing exchange relationships than they do in the United States. A cross-national questionnaire that included 1,136 Japanese and 501 American respondents showed that the American respondents were more trusting of other people in general and considered reputation more important. In contrast, the Japanese respondents placed a higher value on exchange relationships that are based on personal relationships. One explanation for this is that the extreme social stability of mutually committed relationships in Japan reduces uncertainty about transactions and diminishes the role of trust (Doney et al., 1998).

More generally, cultural differences associated with power distance (e.g., dependence on authority and respect for authoritarian norms), uncertainty avoidance, and individualist and collectivist attitudes can influence the development and role of trust (Doney et al., 1998). Cultural variation can influence trust in an on-line environment. Karvonen (2001) compared trust in E-commerce services among consumers from Finland, Sweden, and Iceland. Although all trusted a simple design, there were substantial differences in the initial trust in a complex design, with Finnish customers being the most wary and Icelandic customers the most trusting. More generally, the level of social trust varies substantially among countries, and this variation accounts for over 64% of the variance in the level of Internet adoption (Huang, Keser, Leland, & Shachat, 2002). The culturally dependent nature of trust suggests that findings regarding trust in automation may need to be verified when they are extrapolated from one cultural setting to another.

Cultural context can also describe systematic differences in groups of workers. For example, in the context of trust in automation, Riley (1996) found that pilots, who are accustomed to using automation, trusted and relied on automation more than did students, who were not as familiar with automation. Likewise, in field studies of operators adapting to computerized manufacturing systems, Zuboff (1988) found that the culture associated with those operators who had been exposed to computer systems led to greater trust and acceptance of automation. These results show that trust in automation, like trust in people, is culturally influenced in that it depends on the long-term experiences of a group of people.

Trust between people depends on the individual, organizational, and cultural context. This context influences trust because it affects initial levels of trust and how people interpret information regarding the agent. Some evidence shows that these relationships may also hold for trust in automation; however, very little research has investigated any of these factors. We address the types of information that form the basis of trust in the next section.

Basis of Trust: Levels of Detail and of Attributional Abstraction

Trust is a multidimensional construct that is based on trustee characteristics such as motives, intentions, and actions (Bhattacharya, Devinney, & Pillutla, 1998; Mayer et al., 1995). The basis for trust is the information that informs the person about the ability of the trustee to achieve the goals of the trustor. Two critical elements define the basis of trust. The first is the focus of trust: What is to be trusted? The second is the type of information that describes the entity to be trusted: What is the information supporting trust? This information guides expectations regarding how well the trustee can achieve the trustor's goals. The focus of trust can be described according to levels of detail, and the information supporting trust can be described according to three levels of attributional abstraction.

The focus of trust can be considered along a dimension defined by the level of detail, which varies from general trust of institutions and organizations to trust in a specific agent. With automation, this might correspond to trust in an overall system of automation as compared with trust in a particular mode of an automatic controller. Couch and Jones (1997) considered three levels of trust – generalized, network, and partner trust – which are similar to the three levels identified by McKnight et al. (1998). General trust corresponds to trust in human nature and institutions (Barber, 1983). Network trust refers to the cluster of acquaintances that constitutes a person's social network, whereas partner trust corresponds to trust in specific persons (Rempel et al., 1985).

There does not seem to be a reliable relationship between global and specific trust, suggesting that this dimension makes important distinctions regarding the evolution of trust (Couch & Jones, 1997). For example, trust in a supervisor and trust in an organization are distinct and depend on qualitatively different factors (Tan & Tan, 2000). Trust in a supervisor depends on perceived ability, integrity, and benevolence, whereas trust in the organization depends on more distal variables of perceived organizational support and justice. General trust is frequently considered to be a personality trait, whereas specific trust is frequently viewed as an outcome of a specific interaction (Couch & Jones, 1997). Several studies show, however, that general and specific trust are not necessarily tied to the "trait and state" distinctions and may instead simply refer to different levels of detail (Sitkin & Roth, 1993). Developing a high degree of functional specificity in the trust of automation may require specific information for each level of detail of the automation.

The basis of trust can be considered along a dimension of attributional abstraction, which varies from demonstrations of competence to the intentions of the agent. A recent review of the trust literature concluded that three general levels summarize the bases of trust: ability, integrity, and benevolence (Mayer et al., 1995). Ability is the group of skills, competencies, and characteristics that enable the trustee to influence the domain. Integrity is the degree to which the trustee adheres to a set of principles the trustor finds acceptable. Benevolence is the extent to which the intents and motivations of the trustee are aligned with those of the trustor.

In the context of interpersonal relationships, Rempel et al. (1985) viewed trust as an evolving phenomenon, with the basis of trust changing as the relationship progressed. They argued that predictability, the degree to which future behavior can be anticipated (and which is similar to ability), forms the basis of trust early in a relationship. This is followed by dependability, which is the degree to which behavior is consistent and is similar to integrity. As the relationship matures, the basis of trust ultimately shifts to faith, which is a more general judgment that a person can be relied upon and is similar to benevolence. A similar progression emerged in a study of operators' adaptation to new technology (Zuboff, 1988). Trust in that context depended on trial-and-error experience, followed by understanding of the technology's operation, and finally faith. Lee and Moray (1992) made similar distinctions in defining the factors that influence trust in automation. They identified performance, process, and purpose as the general bases of trust.

Performance refers to the current and historical operation of the automation and includes characteristics such as reliability, predictability, and ability. Performance information describes what the automation does. More specifically, performance refers to the competency or expertise as demonstrated by its ability to achieve the operator's goals. Because performance is linked to the ability to achieve specific goals, it demonstrates the task- and situation-dependent nature of trust. This is similar to Sheridan's (1992) concept of robustness as a basis for trust in human-automation relationships. The operator will tend to trust automation that performs in a manner that reliably achieves his or her goals.

Process is the degree to which the automation's algorithms are appropriate for the situation and able to achieve the operator's goals. Process information describes how the automation operates. In interpersonal relationships, this corresponds to the consistency of actions associated with adherence to a set of acceptable principles (Mayer et al., 1995). Process as a basis for trust reflects a shift away from focus on specific behaviors and toward qualities and characteristics attributed to an agent. With process, trust is in the agent and not in the specific actions of the agent. Because of this, the process basis of trust relies on dispositional attributions and inferences drawn from the performance of the agent. In this sense, process is similar to dependability and integrity. Openness, the willingness to give and receive ideas, is another element of the process basis of trust. Interestingly, consistency and openness are more important for trust in peers than for trust in supervisors or subordinates (Schindler & Thomas, 1993).

In the context of automation, the process basis of trust refers to the algorithms and operations that govern the behavior of the automation. This concept is similar to Sheridan's (1992) concept of understandability in human-automation relationships. The operator will tend to trust the automation if its algorithms can be understood and seem capable of achieving the operator's goals in the current situation.

Purpose refers to the degree to which the automation is being used within the realm of the designer's intent. Purpose describes why the automation was developed. Purpose corresponds to faith and benevolence and reflects the perception that the trustee has a positive orientation toward the trustor. With interpersonal relationships, the perception of such a positive orientation depends on the intentions and motives of the trustee. This can take the form of abstract, generalized value congruence (Sitkin & Roth, 1993), which can be described as whether, and to what extent, the trustee has a motive to lie (Hovland, Janis, & Kelly, 1953). The purpose basis of trust reflects the attribution of these characteristics to the automation. Frequently, whether or not this attribution takes place will depend on whether the designer's intent has been communicated to the operator. If so, the operator will tend to trust the automation to achieve the goals it was designed to achieve.


TABLE 1: Summary of the Dimensions That Describe the Basis of Trust

Study: Basis of Trust (Summary Dimension)

Barber (1983): Competence (Performance); Persistence (Process); Fiduciary responsibility (Purpose)
Butler & Cantrell (1984): Competence (Performance); Integrity (Process); Consistency (Process); Loyalty (Purpose); Openness (Process)
Cook & Wall (1980): Ability (Performance); Intentions (Purpose)
Deutsch (1960): Ability (Performance); Intentions (Purpose)
Gabarro (1978): Integrity (Process); Motives (Purpose); Openness (Process); Discreetness (Process); Functional/specific competence (Performance); Interpersonal competence (Performance); Business sense (Performance); Judgment (Performance)
Hovland, Janis, & Kelly (1953): Expertise (Performance); Motivation to lie (Purpose)
Jennings (1967): Loyalty (Purpose); Predictability (Process); Accessibility (Process); Availability (Process)
Kee & Knox (1970): Competence (Performance); Motives (Purpose)
Mayer et al. (1995): Ability (Performance); Integrity (Process); Benevolence (Purpose)
Mishra (1996): Competency (Performance); Reliability (Performance); Openness (Process); Concern (Purpose)
Moorman et al. (1993): Integrity (Process); Willingness to reduce uncertainty (Process); Confidentiality (Process); Expertise (Performance); Tactfulness (Process); Sincerity (Process); Congeniality (Performance); Timeliness (Performance)
Rempel et al. (1985): Reliability (Performance); Dependability (Process); Faith (Purpose)
Sitkin & Roth (1993): Context-specific reliability (Performance); Generalized value congruence (Purpose)
Zuboff (1988): Trial-and-error experience (Performance); Understanding (Process); Leap of faith (Purpose)


Table 1 builds on the work of Mayer et al. (1995) and shows how the many characteristics of a trustee that affect trust can be summarized by the three bases of performance, process, and purpose. As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.
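As a rough illustration of how Table 1's taxonomy might be operationalized, the sketch below rolls hypothetical 0-1 ratings of observed trustee characteristics up into the three bases. The characteristic-to-base mapping follows the table; the averaging rule and the ratings themselves are assumed simplifications, not a method from the article.

```python
# Illustrative only: summarizing Table 1-style trustee characteristics into
# the three bases of trust. The mapping follows the table; the averaging
# rule is a hypothetical simplification.
BASE_OF = {
    "competence": "performance", "reliability": "performance",
    "consistency": "process", "openness": "process", "integrity": "process",
    "loyalty": "purpose", "benevolence": "purpose", "concern": "purpose",
}

def basis_of_trust(observed):
    """Average 0-1 ratings of observed characteristics within each base."""
    totals, counts = {}, {}
    for characteristic, rating in observed.items():
        base = BASE_OF[characteristic]
        totals[base] = totals.get(base, 0.0) + rating
        counts[base] = counts.get(base, 0) + 1
    return {base: totals[base] / counts[base] for base in totals}

print(basis_of_trust({"competence": 0.9, "consistency": 0.6, "concern": 0.4}))
# -> {'performance': 0.9, 'process': 0.6, 'purpose': 0.4}
```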

Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms, associated with the dimension of process; analysis of internal mechanisms can support inferences regarding the designer's intent, associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by the manager (purpose basis for trust) and the manager's actions (performance basis of trust) has a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty, in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation, in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values. The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust, because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings or procedures guide behavior.
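The fragility of category-based trust can be illustrated with a small sketch. Everything in it is hypothetical (the categories, the default trust values, and the collapse-to-zero rule); the point is only that a rule-based default applies within the normal situations the category's assurances were built on and offers no guidance outside them.

```python
# Hypothetical sketch of analogical, rule-based trust: the default level of
# trust comes from category membership (role, reputation, etiquette) rather
# than from observed performance, so it collapses when the situation falls
# outside the category's normal scope.
CATEGORY_TRUST = {"certified_autopilot": 0.9, "prototype_aid": 0.4}

def analogical_trust(category, situation_is_normal):
    """Condition-action pairing: apply the category default only within the
    normal situations the category's assurances were built on."""
    if not situation_is_normal:
        return 0.0              # assurances no longer valid; trust collapses
    return CATEGORY_TRUST.get(category, 0.5)  # 0.5 = no category information

print(analogical_trust("certified_autopilot", situation_is_normal=True))   # 0.9
print(analogical_trust("certified_autopilot", situation_is_normal=False))  # 0.0
```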

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oatley, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oatley). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation and that these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions. It also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner; however, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional, nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous computerized messages were less effective than face-to-face communication, because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejtersen, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them: People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves; particularly in romantic relationships, it often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray, 1996). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but the contradiction can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from direct observation of system behavior (performance), from an understanding of the underlying mechanisms (process), or from the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and by the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference from trust between people; rather, it reflects how people are often thrust into relationships with automation in a way that forces trust to develop differently than it does in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of the dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what they thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct for understanding reliance on automation; however, the differences in symmetry, the lack of intentionality, and the differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display in developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication; the automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator; the automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.
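The two indicator logics can be made concrete in a few lines of Python (a purely illustrative sketch of the distinction; the function name and signals are hypothetical, and Meyer's studies involved no such implementation):

    def operator_should_act(automation_type, normal_signal_on, alarm_on):
        """Reliance-oriented: act when the implicit 'all normal' indication disappears.
        Compliance-oriented: act when an explicit alarm appears."""
        if automation_type == "reliance":
            return not normal_signal_on  # implicit indicator of the need to act
        if automation_type == "compliance":
            return alarm_on              # explicit indicator of the need to act
        raise ValueError(f"unknown automation type: {automation_type}")

    # The same abnormal state, signaled two different ways:
    print(operator_should_act("reliance", normal_signal_on=False, alarm_on=False))   # True
    print(operator_should_act("compliance", normal_signal_on=False, alarm_on=True))  # True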

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships (McKnight et al., 1998) suggest that a similar phenomenon occurs in trust between people and automation. For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of e-mail messages (Coutu, 1998). A similar effect is seen when initial expectations of perfect automation performance lead to disuse once people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely it is that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
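The consequences of this feedback loop are easy to demonstrate. The following minimal Python sketch (our illustration, with arbitrary function names and parameter values; it is not a model fitted in any of the studies cited here) couples a first-order trust update to a reliance threshold, so that trust can move toward the automation's true capability only while the automation is used and therefore observable:

    def simulate_trust(trust0, steps=200, dt=0.1, tau=5.0,
                       capability=0.9, threshold=0.5):
        """Closed-loop sketch: trust tracks capability only while relying."""
        trust, history = trust0, []
        for _ in range(steps):
            relying = trust >= threshold             # intention to rely
            target = capability if relying else 0.0  # observable only when used
            trust += (dt / tau) * (target - trust)   # first-order dynamics
            history.append((trust, relying))
        return history

    # Two operators whose predispositions to trust differ only slightly:
    for t0 in (0.52, 0.48):
        trust, relying = simulate_trust(t0)[-1]
        print(f"initial trust {t0:.2f} -> final trust {trust:.2f}, relying={relying}")

With these arbitrary settings, an initial trust of 0.52 crosses the reliance threshold and converges near the automation's capability (about 0.89), whereas 0.48 never crosses it and decays toward zero under manual control: a small initial difference is amplified into fully automatic versus fully manual control.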

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault level (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).

A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function: the largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray, 1994).
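Expressed as a sketch consistent with this verbal description (the actual fitted model and its coefficients appear in Lee & Moray, 1994), a first-order model of trust and its response to a step change in automation capability at t = 0 can be written as:

    % Illustrative first-order trust dynamics (not the fitted equation):
    % T(t) = trust at time t; T_{ss} = steady-state trust implied by the
    % automation's current capability; \tau = time constant of trust
    \tau \frac{dT(t)}{dt} = T_{ss} - T(t)
    % Response to a step change (e.g., a fault) from initial trust T_0:
    T(t) = T_{ss} + (T_0 - T_{ss}) e^{-t/\tau}

The step response captures the qualitative pattern described above: the largest change in trust occurs immediately after the fault, the residual effect decays exponentially, and the time constant corresponds to the temporal specificity of trust.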

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration, for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change; enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
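As one concrete signature, hysteresis can emerge from nothing more than asymmetric rates of trust decline and recovery. The following Python sketch (again our illustration, with arbitrary parameters motivated by the later observation that negative interactions outweigh positive ones) sweeps automation capability down and then back up, and trust traces different paths on the two legs:

    def trust_step(trust, capability, dt=0.1, tau_up=8.0, tau_down=2.0):
        """One first-order update; trust erodes faster than it recovers."""
        tau = tau_down if capability < trust else tau_up
        return trust + (dt / tau) * (capability - trust)

    trust = 0.9
    down = [c / 100 for c in range(90, 29, -1)]  # capability 0.90 -> 0.30
    up = [c / 100 for c in range(30, 91)]        # capability 0.30 -> 0.90
    path = []
    for c in down + up:
        trust = trust_step(trust, c)
        path.append((c, trust))

    # At the same capability, trust differs with the direction of change:
    falling = next(t for c, t in path[:len(down)] if abs(c - 0.6) < 1e-9)
    rising = next(t for c, t in path[len(down):] if abs(c - 0.6) < 1e-9)
    print(f"trust at capability 0.60: falling leg {falling:.2f}, rising leg {rising:.2f}")

At the same capability, trust is higher while capability is falling (trust lags above it) than while it is rising (trust lags below it): a delayed, direction-dependent response of the kind a purely static analysis cannot represent.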

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that in time-critical situations the final authority for decisions and actions should be allocated to automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors caused trust and reliance to decline more for those in a familiar city, whose self-confidence was high, than for those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust; these results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation in which people had greater confidence in their own ability than in the automation, even though the automation's performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in the automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996); the findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and thus greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).

One approach to reducing overtrust in the automation and increasing detection of failures is adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with responsibility for other tasks seems to increase reliance on the automation.

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault: a fault in the automation did not cause trust to decline in other, similar, but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in the trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the performance dimension was also the one most influenced by the presence or absence of a fault, as compared with the process or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al., 2000; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, the appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of automation-related information is usually mediated by a display. This suggests that the appropriateness of trust (that is, the match between the trust and the capabilities of the automation) depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an online survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced the trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust: these features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form, in particular the emphasis on concrete, realistic representations, can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent: a study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently than of one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).

These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk than when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably; achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which the benefits of making automation simpler but less capable outweigh the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences the capability of the automation. The concept of ecological interface design has led to a substantial research effort focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links between the abstraction hierarchy and the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research on human-automation interaction and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations, because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. The few studies that have considered these issues suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure in the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization; as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen I amp Fishbein M (1980) Understanding attitudes and pre-dicting social behavior Upper Saddle River NJ Prentice Hall

Baba M L (1999) Dangerous liaisons Trust distrust and infor-mation technology in American work organizations HumanOrganization 58 331ndash346

Baba M L Falkenburg D R amp Hill D H (1996) Technologymanagement and American culture Implications for businessprocess redesign Research Technology Management 39(6)44ndash54

Bandura A (1982) Self-efficacy mechanism in human agencyAmerican Psychologist 37 122ndash147

Barber B (1983) The logic and limits of trust New BrunswickNJ Rutgers University Press

Barley S R (1988) The social construction of a machine Ritualsuperstition magical thinking and other pragmatic responsesto running a CT scanner In M Lock amp D R Gordon (Eds)Biomedicine examined (pp 497ndash539) Dordrecht NetherlandsKluwer Academic

Bechara A Damasio A R Damasio H amp Anderson S W(1994) Insensitivity to future consequences following damageto human prefrontal cortex Cognition 50 7ndash15

Bechara A Damasio H Tranel D amp Anderson S W (1998)Dissociation of working memory from decision making withinthe human prefrontal cortex Journal of Neuroscience 18428ndash437

TRUST IN AUTOMATION 77

Bechara A Damasio H Tranel D amp Damasio A R (1997)Deciding advantageously before knowing the advantageousstrategy Science 275(5304) 1293ndash1295

Beer R D (2001) Dynamical approaches in cognitive scienceTrends in Cognitive Sciences 4 91ndash99

Berscheid E (1994) Interpersonal relationships Annual Reviewof Psychology 45 79ndash129

Bhattacharya R Devinney T M amp Pillutla M M (1998) A for-mal model of trust based on outcomes Academy of Manage-ment Review 23 459ndash472

Bisantz A M amp Seong Y (2001) Assessment of operator trustin and utilization of automated decision-aids under differentframing conditions International Journal of Industrial Ergo-nomics 28(2) 85ndash97

Bliss J Dunn M amp Fuller B S (1995) Reversal of the cry-wolfeffect ndash An investigation of two methods to increase alarmresponse rates Perceptual and Motor Skills 80 1231ndash1242

Brehmer B amp Dorner D (1993) Experiments with computer-simulated microworlds Escaping both the narrow straits of thelaboratory and the deep blue sea of the field-study Computersin Human Behavior 9 171ndash184

Brewster S A (1997) Using non-speech sound to overcome infor-mation overload Displays 17 179ndash189

Briggs P Burford B amp Dracup C (1998) Modeling self-confidence in users of a computer-based system showingunrepresentative design International Journal of Human-Computer Studies 49 717ndash742

Burt R S amp Knez M (1996) Trust and third-party gossip In RM Kramer amp T R Tyler (Eds) Trust in organizations Frontiersof theory and research (pp 68ndash89) Thousand Oaks CA Sage

Busemeyer J R amp Townsend J T (1993) Decision field theoryA dynamic cognitive approach to decision making in an uncer-tain environment Psychological Review 100 432ndash459

Butler J K amp Cantrell R S (1984) A behavioral decision theoryapproach to modeling trust in superiors and subordinatesPsychological Reports 55 19ndash28

Case K Sinclair M A amp Rani M R A (1999) An experimentalinvestigation of human mismatches in machining Proceedingsof the Institution of Mechanical Engineers Part B ndash Journal ofEngineering Manufacture 213 197ndash201

Cassell J amp Bickmore T (2000) External manifestations of trust-worthiness in the interface Communications of the ACM43(12) 50ndash56

Castelfranchi C (1998) Modelling social action for AI agentsArtificial Intelligence 103 157ndash182

Castelfranchi C amp Falcone R (2000) Trust and control Adialectic link Applied Artificial Intelligence 14 799ndash823

Cohen M S (2000) Training for trust in decision aids with appli-cations to the rotorcraftrsquos pilotrsquos associate Presented at theInternational Conference on Human Performance SituationAwareness and Automation Savannah GA Abstract retrievedMarch 8 2004 from httpwwwcog-techcompapersTrustAdaptive20Automation20abstract20finalpdf

Cohen M S Parasuraman R amp Freeman J (1999) Trust indecision aids A model and its training implications (TechReport USAATCOM TR 97-D-4) Arlington VA CognitiveTechnologies

Conejo R A amp Wickens C D (1997) The effects of highlightingvalidity and feature type on air-to-ground target acquisitionperformance (ARL-97-11NAWC-ONR-97-1) Savoy ILAviation Research Laboratory Institute for Aviation

Cook J amp Wall T (1980) New work attitude measures of trustorganizational commitment and personal need non-fulfillmentJournal of Occupational Psychology 53 39ndash52

Corritore C L Kracher B amp Wiedenbeck S (2001a MarchndashApril) Trust in the online environment Workshop presentedat the CHI 2001 Conference on Human Factors in ComputingSystems Seattle WA

Corritore C L Kracher B amp Wiedenbeck S (2001b) Trust inthe online environment In M J Smith G Salvendy D Harrisamp R J Koubek (Eds) Usability evaluation and interfacedesign Cognitive engineering intelligent agents and virtualreality (pp 1548-1552) Mahwah NJ Erlbaum

Corritore C L Kracher B amp Wiedenbeck S (2003a) On-linetrust Concepts evolving themes a model International Journalof Human-Computer Studies 58 737ndash758

Corritore C L Kracher B amp Wiedenbeck S (Eds) (2003b)Trust and technology [Special issue] International Journal ofHuman-Computer Studies 58(6)

Couch L L amp Jones W H (1997) Measuring levels of trustJournal of Research in Personality 31(3) 319ndash336

Coutu D L (1998) Trust in virtual teams Harvard BusinessReview 76 20ndash21

Crocoll W M amp Coury B G (1990) Status or recommendationSelecting the type of information for decision aiding InProceedings of the Human Factors Society 34th AnnualMeeting (pp 1524ndash1528) Santa Monica CA Human Factorsand Ergonomics Society

Crossman E R F W (1974) Automation and skill In E Edwardsamp F P Lees (Eds) The human operator in process control(pp 1ndash24) London Taylor amp Francis

Damasio A R (1996) The somatic marker hypothesis and thepossible functions of the prefrontal cortex PhilosophicalTransactions of the Royal Society of London Series B ndash Bio-logical Sciences 351 1413ndash1420

Damasio A R (1999) The feeling of what happens Body andemotion in the making of consciousness New York Harcourt

Damasio A R (2003) Looking for Spinoza Joy sorrow and thefeeling brain New York Harcourt

Damasio A R Tranel D amp Damasio H (1990) Individualswith sociopathic behavior caused by frontal damage fail torespond autonomically to social-stimuli Behavioural BrainResearch 41 81ndash94

Dassonville I Jolly D amp Desodt A M (1996) Trust betweenman and machine in a teleoperation system Reliability Engi-neering and System Safety 53 319ndash325

Deutsch M (1958) Trust and suspicion Journal of ConflictResolution 2 265ndash279

Deutsch M (1960) The effect of motivational orientation upontrust and suspicion Human Relations 13 123ndash139

Doney P M Cannon J P amp Mullen M R (1998) Understandingthe influence of national culture on the development of trustAcademy of Management Review 23 601ndash620

Dzindolet M T Pierce L G Beck H P amp Dawe L A (2002)The perceived utility of human and automated aids in a visualdetection task Human Factors 44 79ndash94

Dzindolet M T Pierce L G Beck H P Dawe L A ampAnderson B W (2001) Predicting misuse and disuse of com-bat identification systems Military Psychology 13 147ndash164

Entin E B Entin E E amp Serfaty D (1996) Optimizing aidedtarget-recognition performance In Proceedings of the HumanFactors and Ergonomics Society 40th Annual Meeting (pp233ndash237) Santa Monica CA Human Factors and ErgonomicsSociety

Farrell S amp Lewandowsky S (2000) A connectionist model ofcomplacency and adaptive recovery under automation Journalof Experimental Psychology ndash Learning Memory and Cognition26 395ndash410

Fine G A amp Holyfield L (1996) Secrecy trust and dangerousleisure Generating group cohesion in voluntary organizationsSocial Psychology Quarterly 59 22ndash38

Finucane M L Alhakami A Slovic P amp Johnson S M (2000)The affect heuristic in judgments of risks and benefits Journalof Behavioral Decision Making 13 1ndash17

Fishbein M amp Ajzen I (1975) Belief attitude intention andbehavior Reading MA Addison-Wesley

Fogg B Marshall J Kameda T Solomon J Rangnekar ABoyd J et al (2001) Web credibility research A method foronline experiments and early study results In CHI 2001Conference on Human Factors in Computing Systems (pp293ndash294) New York Association for Computing Machinery

Fogg B Marshall J Laraki O Osipovich A Varma C FangN et al (2001) What makes Web sites credible A report ona large quantitative study In CHI 2001 Conference on HumanFactors in Computing Systems (pp 61ndash68) New York Asso-ciation for Computing Machinery

Fox J M (1996) Effects of information accuracy on user trustand compliance In CHI 1996 Conference on Human Factorsin Computing Systems (pp 35-36) New York Association forComputing Machinery

Fox J M amp Boehm-Davis D A (1998) Effects of age and con-gestion information accuracy of advanced traveler informationon user trust and compliance Transportation Research Record1621 43ndash49

Furukawa H amp Inagaki T (1999) Situation-adaptive interfacebased on abstraction hierarchies with an updating mechanismfor maintaining situation awareness of plant operators InProceedings of the 1999 IEEE International Conference on

78 Spring 2004 ndash Human Factors

Systems Man and Cybernetics (pp 693ndash698) Piscataway NJ IEEE

Furukawa H amp Inagaki T (2001) A graphical interface designfor supporting human-machine cooperation Representingautomationrsquos intentions by means-ends relations In Proceedingsof the 8th IFACIFIPIFORSIEA Symposium on AnalysisDesign and Evaluation of Human-Machine Systems (pp573ndash578) Laxenburg Austria International Federation ofAutomatic Control

Gabarro J J (1978) The development of trust influence and ex-pectations In A G Athos amp J J Gabarro (Eds) Interpersonalbehavior Communication and understanding in relationships(pp 290ndash230) Englewood Cliffs NJ Prentice Hall

Gaines S O Panter A T Lyde M D Steers W N Rusbult CE Cox C L et al (1997) Evaluating the circumplexity ofinterpersonal traits and the manifestation of interpersonal traitsin interpersonal trust Journal of Personality and SocialPsychology 73 610ndash623

Gist M E amp Mitchell T R (1992) Self-efficacy ndash A theoreticalanalysis of its determinants and malleability Academy ofManagement Review 17 183ndash211

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perservative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering, with a focus on trust in automation, from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003



possible to identify critical considerations for understanding the role of trust in mediating human-automation interaction. Some researchers focus on trust as an attitude or expectation, and they tend to define trust in one of the following ways: "expectancy held by an individual that the word, promise or written communication of another can be relied upon" (Rotter, 1967, p. 651); "expectation related to subjective probability an individual assigns to the occurrence of some set of future events" (Rempel et al., 1985, p. 96); "expectation of technically competent role performance" (Barber, 1983, p. 14); or "expectations of fiduciary obligation and responsibility, that is, the expectation that some others in our social relationships have moral obligations and responsibility to demonstrate a special concern for others' interests above their own" (Barber, p. 14). These definitions all include the element of expectation regarding behaviors or outcomes. Clearly, trust concerns an expectancy or an attitude regarding the likelihood of favorable responses.

Another common approach characterizes trust as an intention or willingness to act. This goes beyond attitude in that trust is characterized as an intention to behave in a certain manner or to enter into a state of vulnerability. For example, trust has been defined as "willingness to place oneself in a relationship that establishes or increases vulnerability with the reliance upon someone or something to perform as expected" (Johns, 1996, p. 81); "willingness to rely on an exchange partner in whom one has confidence" (Moorman et al., 1993, p. 82); and "willingness of a party to be vulnerable to the actions of another party based on the expectation that the other will perform a particular action important to the trustor, irrespective of the ability to monitor or control that party" (Mayer, Davis, & Schoorman, 1995, p. 712).

The definition by Mayer et al. (1995) is the most widely used and accepted definition of trust (Rousseau, Sitkin, Burt, & Camerer, 1998). As of April 2003, the Institute for Scientific Information citation database showed 203 citations of this article, far more than others on the topic of trust. The definition identifies vulnerability as a critical element of trust. For trust to be an important part of a relationship, individuals must willingly put themselves at risk or in vulnerable positions by delegating responsibility for actions to another party.

Some authors go beyond intention and define trust as a behavioral result or state of vulnerability or risk (Deutsch, 1960; Meyer, 2001). According to these definitions, trust is the outcome of actions that place people into certain states or situations. It can be seen as, for example, "a state of perceived vulnerability or risk that is derived from an individual's uncertainty regarding the motives, intentions, and prospective actions of others on whom they depend" (Kramer, 1999, p. 571).

These definitions highlight some important inconsistencies regarding whether trust is a belief, attitude, intention, or behavior. These distinctions are of great theoretical importance, as multiple factors mediate the process of translating beliefs and attitudes into behaviors.

Ajzen and Fishbein (1980; Fishbein & Ajzen, 1975) developed a framework that can help reconcile these conflicting definitions of trust. Their framework shows that behaviors result from intentions and that intentions are a function of attitudes. Attitudes, in turn, are based on beliefs. According to this framework, beliefs and perceptions represent the information base that determines attitudes. The availability of information and the person's experiences influence beliefs. An attitude is an affective evaluation of beliefs that guides people to adopt a particular intention. Intentions then translate into behavior, according to the environmental and cognitive constraints a person faces. In the context of trust and reliance, trust is an attitude and reliance is a behavior. This framework keeps beliefs, attitudes, intentions, and behavior conceptually distinct and can help explain the influence of trust on reliance. According to this framework, trust affects reliance as an attitude rather than as a belief, intention, or behavior. Beliefs underlie trust, and various intentions and behaviors may result from different levels of trust.
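The belief-attitude-intention-behavior chain can be caricatured in a few lines of code. The sketch below is purely illustrative and is not a model proposed in the literature reviewed here; the particular beliefs, the equal-weight aggregation, and the thresholds are assumptions made for illustration.

```python
# Illustrative sketch of the belief -> attitude -> intention -> behavior
# chain (after Ajzen & Fishbein). Weights and thresholds are assumptions.

def trust_attitude(beliefs: dict[str, float]) -> float:
    """Attitude: an affective evaluation of beliefs about the automation."""
    return sum(beliefs.values()) / len(beliefs)

def intention_to_rely(trust: float, self_confidence: float) -> bool:
    """Intention: forms when trust outweighs confidence in manual control."""
    return trust > self_confidence

def reliance(intend: bool, time_to_engage: float, time_available: float) -> bool:
    """Behavior: intention translates into reliance only if constraints
    permit, e.g., enough time to engage the automation (cf. Kirlik, 1993)."""
    return intend and time_to_engage <= time_available

beliefs = {"reliability": 0.9, "understandability": 0.7, "designer_intent": 0.8}
trust = trust_attitude(beliefs)          # attitude
intend = intention_to_rely(trust, 0.6)   # intention
relies = reliance(intend, 2.0, 5.0)      # behavior
```

The point of the sketch is only the conceptual separation: trust is the attitude in the middle of the chain, so factors such as available time can block reliance even when trust is high.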

Considering trust as an intention or behavior has the potential to confuse its effect with the effects of other factors that can influence behavior, such as workload, situation awareness, and self-confidence of the operator (Lee & Moray, 1994; Riley, 1994). Trust is not the only factor mediating the relationship between beliefs and behavior. Other psychological, system, and environmental constraints intervene, such as when operators do not have enough time to engage the automation even though they trust it and intend to use it, or when the effort to engage the automation outweighs its benefits (Kirlik, 1993). Trust stands between beliefs about the characteristics of the automation and the intention to rely on the automation.

Many definitions of trust also indicate the importance of the goal-oriented nature of trust. Although many definitions do not identify this aspect of trust explicitly, several mention the ability of the trustee to perform an important action and the expectation of the trustor that the trustee will perform as expected or can be relied upon (Gurtman, 1992; Johns, 1996; Mayer et al., 1995). These definitions describe the basis of trust in terms of the performance of an agent, the trustee, who furthers the goals of an individual, the trustor. In this way, trust describes a relationship that depends on the characteristics of the trustee, the trustor, and the goal-related context of the interaction. Trust is not a consideration in situations where the trustor does not depend on the trustee to perform some function related to the trustor's goals.

A simple definition of trust consistent with these considerations is the attitude that an agent will help achieve an individual's goals in a situation characterized by uncertainty and vulnerability. This basic definition must be elaborated to consider the appropriateness of trust, the influence of context, the goal-related characteristics of the agent, and the cognitive processes that govern the development and erosion of trust. Figure 1 shows how these factors interact in a dynamic process of reliance, and the following sections elaborate on various components of this conceptual model.

First, we will consider the appropriateness of trust. In Figure 1, appropriateness is shown as the relationship between the true capabilities of the agent and the level of trust. Second, we will consider the context defined by the characteristics of the individual, organization, and culture. Figure 1 shows this as a collection of factors at the top of the diagram, each affecting a different element of the belief-attitude-intention-behavior sequence associated with reliance on an agent. Third, we will describe the basis of trust. This defines the different types of information needed to maintain an appropriate level of trust, which the bottom of Figure 1 shows in terms of the purpose, process, and performance dimensions that describe the goal-oriented characteristics of the agent. Finally, regarding the cognitive process governing trust, trust depends on the interplay among the analytic, analogical, and affective processes. In each of these sections we will cite literature regarding the organizational, sociological, interpersonal, psychological, and neurological perspectives of human-human trust and then draw parallels with human-automation trust. The understanding of trust in automation can benefit from a multidisciplinary consideration of how context, agent characteristics, and cognitive processes affect the appropriateness of trust.

Figure 1. The interaction of context, agent characteristics, and cognitive properties with the appropriateness of trust.

Appropriate Trust: Calibration, Resolution, and Specificity

Inappropriate reliance associated with misuse and disuse depends in part on how well trust matches the true capabilities of the automation. Supporting appropriate trust is critical in avoiding misuse and disuse of automation, just as it is in facilitating effective interpersonal relationships (Wicks, Berman, & Jones, 1999).

Calibration, resolution, and specificity of trust describe mismatches between trust and the capabilities of automation. Calibration refers to the correspondence between a person's trust in the automation and the automation's capabilities (Lee & Moray, 1994; Muir, 1987). The definitions of appropriate calibration of trust parallel those of misuse and disuse in describing appropriate reliance. Overtrust is poor calibration in which trust exceeds system capabilities; with distrust, trust falls short of the automation's capabilities. In Figure 2, good calibration is represented by the diagonal line, where the level of trust matches automation capabilities. Above this line is overtrust, and below it is distrust.
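Treating trust and capability as commensurate quantities, as Figure 2 does, this classification can be made explicit. The following is a minimal sketch, assuming both quantities have been normalized to [0, 1] and that a tolerance band around the diagonal counts as calibrated; the band width is an arbitrary assumption, not a value from the reviewed studies.

```python
def calibration(trust: float, capability: float, tol: float = 0.1) -> str:
    """Classify trust relative to automation capability (both on [0, 1]).

    Points near the diagonal of Figure 2 are calibrated; above it, trust
    exceeds capability (overtrust, risking misuse); below it, trust falls
    short of capability (distrust, risking disuse). The tolerance band
    is an illustrative assumption.
    """
    if trust > capability + tol:
        return "overtrust"   # may lead to misuse
    if trust < capability - tol:
        return "distrust"    # may lead to disuse
    return "calibrated"
```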

Resolution refers to how precisely a judgment of trust differentiates levels of automation capability (Cohen et al., 1999). Figure 2 shows that poor resolution occurs when a large range of automation capability maps onto a small range of trust. With low resolution, large changes in automation capability are reflected in small changes in trust. Specificity refers to the degree to which trust is associated with a particular component or aspect of the trustee. Functional specificity describes the differentiation of functions, subfunctions, and modes of automation. With high functional specificity, a person's trust reflects capabilities of specific subfunctions and modes. Low functional specificity means the person's trust reflects the capabilities of the entire system.

Figure 2. The relationship among calibration, resolution, and automation capability in defining appropriate trust in automation (trust plotted against automation capability, or trustworthiness). Overtrust may lead to misuse and distrust may lead to disuse.

Specificity can also describe changes in trust as a function of the situation or over time. High temporal specificity means that a person's trust reflects moment-to-moment fluctuations in automation capability, whereas low temporal specificity means that the trust reflects only long-term changes in automation capability. Although temporal specificity implies a generic change over time as the person's trust adjusts to failures with the automation, temporal specificity also addresses adjustments that should occur when the situation or context changes and affects the capability of the automation. Temporal specificity reflects the sensitivity of trust to changes in context that affect automation capability. High functional and temporal specificity increase the likelihood that the level of trust will match the capabilities of a particular element of the automation at a particular time. Good calibration, high resolution, and high specificity of trust can mitigate misuse and disuse of automation, and so they can guide design, evaluation, and training to enhance human-automation partnerships.
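These properties suggest simple operational measures when paired observations of trust and capability are available over time. The sketch below is one possible operationalization under stated assumptions: measuring resolution as the range of trust used per unit range of capability, and temporal specificity as the correlation between moment-to-moment changes in the two series. These are illustrative choices, not measures taken from the studies reviewed here.

```python
import statistics  # statistics.correlation requires Python 3.10+

def resolution(capability: list[float], trust: list[float]) -> float:
    """Range of trust used per unit range of capability; values near 0
    mean a large range of capability maps onto a small range of trust."""
    cap_range = max(capability) - min(capability)
    trust_range = max(trust) - min(trust)
    return trust_range / cap_range if cap_range else 0.0

def temporal_specificity(capability: list[float], trust: list[float]) -> float:
    """Correlation between successive changes in capability and in trust;
    high values mean trust tracks fluctuations in capability."""
    d_cap = [b - a for a, b in zip(capability, capability[1:])]
    d_trust = [b - a for a, b in zip(trust, trust[1:])]
    return statistics.correlation(d_cap, d_trust)
```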

Individual, Organizational, and Cultural Context

Trust does not develop in a vacuum but instead evolves in a complex individual, cultural, and organizational context. The individual context includes individual differences, such as the propensity to trust. These differences influence the initial level of trust and influence how new information is interpreted. The individual context also includes a person's specific history of interactions that have led to a particular level of trust. The organizational context can also have a strong influence on trust. The organizational context reflects the interactions between people that inform them about the trustworthiness of others, which can include reputation and gossip. The cultural context also influences trust through social norms and expectations. Understanding trust requires a careful consideration of the individual, organizational, and cultural context.

Systematic individual differences influence trust between people. Some people are more inclined to trust than are others (Gaines et al., 1997; Stack, 1978). Rotter (1967) defined trust as an enduring personality trait. This conception of trust follows a social learning theory approach, in which expectations for a particular situation are determined by specific previous experiences with situations that are perceived to be similar (Rotter, 1971). People develop beliefs about others that are generalized and extrapolated from one interaction to another. In this context, trust is a generalized expectancy that is independent of specific experiences and based on the generalization of a large number of diverse experiences. Individual differences regarding trust have important implications for the study of human-automation trust because they may influence reliance in ways that are not directly related to the characteristics of the automation.

Substantial evidence demonstrates that the tendency to trust, considered as a personality trait, can be reliably measured and can influence behavior in a systematic manner. For example, Rotter's (1980) Interpersonal Trust Scale reliably differentiates people on their propensity to trust others, with an internal consistency of .76 and a test-retest reliability of .56 with 7 months between tests. Interestingly, high-trust individuals are not more gullible than low-trust individuals (Gurtman, 1992), and the propensity to trust is not correlated with measures of intellect (Rotter, 1980); however, high-trust individuals are viewed as more trustworthy by others and display more truthful behavior (Rotter, 1971). In fact, people with a high propensity to trust predicted others' trustworthiness better than those with a low propensity to trust (Kikuchi, Wantanabe, & Yamasishi, 1996). Likewise, low- and high-trust individuals respond differently to feedback regarding collaborators' intentions and to situational risk (Kramer, 1999).

These findings may explain why individual differences in the general tendency to trust automation, as measured by a complacency scale (Parasuraman, Singh, Molloy, & Parasuraman, 1992), are not clearly related to misuse of automation. For example, Singh, Molloy, and Parasuraman (1993) found that high-complacency individuals actually detected more automation failures in a constant reliability condition (53.4% compared with 18.7% for low-complacency individuals). This unexpected result is similar to findings in studies of human trust in which highly trusting individuals were found to trust more appropriately. Several studies of trust in automation show that for some people trust changes substantially as the capability of the automation changes, and for other people trust changes relatively little (Lee & Moray, 1994; Masalonis, 2000). One possible explanation that merits further investigation is that high-trust individuals may be better able to adjust their trust to situations in which the automation is highly capable, as well as to situations in which it is not.

In contrast to the description of trust as a stable personality trait, most researchers have focused on trust as an attitude (Jones & George, 1998). In describing interpersonal relationships, trust has been considered as a dynamic attitude that evolves along with the developing relationship (Rempel et al., 1985). The influence of individual differences regarding the predisposition to trust is most important when a situation is ambiguous and generalized expectancies dominate, and it becomes less important as the relationship progresses (McKnight, Cummings, & Chervany, 1998). Trust as an attitude is a history-dependent variable that depends on the prior behavior of the trusted person and the information that is shared (Deutsch, 1958). The initial level of trust is determined by past experiences in similar situations; some of these experiences may even be indirect, as in the case of gossip.

The organizational context influences how reputation, gossip, and formal and informal roles affect the trust of people who have never had any direct contact with the trustee (Burt & Knez, 1996; Kramer, 1999). For example, engineers are trusted not because of the ability of any specific person but because of the underlying education and regulatory structure that governs people in the role of an engineer (Kramer, 1999). The degree to which the organizational context affects the indirect development of trust depends on the strength of links in the social network and on the trustor's ability to establish links between the situations experienced by others and the situations he or she is confronting (Doney et al., 1998). These findings suggest that the organizational context and the indirect exposure that it facilitates can have an important influence on trust and reliance.

Beyond the individual and organizational context, culture is another element of context that can influence trust and its development (Baba, Falkenburg, & Hill, 1996). Culture can be defined as a set of social norms and expectations that reflect shared educational and life experiences associated with national differences or distinct cohorts of workers. In particular, Japanese citizens have been found to have a generally low level of trust (Yamagishi & Yamagishi, 1994). In Japan, networks of mutually committed relations play a more important role in governing exchange relationships than they do in the United States. A cross-national questionnaire that included 1,136 Japanese and 501 American respondents showed that the American respondents were more trusting of other people in general and considered reputation more important. In contrast, the Japanese respondents placed a higher value on exchange relationships that are based on personal relationships. One explanation for this is that the extreme social stability of mutually committed relationships in Japan reduces uncertainty about transactions and diminishes the role of trust (Doney et al., 1998).

More generally, cultural differences associated with power distance (e.g., dependence on authority and respect for authoritarian norms), uncertainty avoidance, and individualist and collectivist attitudes can influence the development and role of trust (Doney et al., 1998). Cultural variation can influence trust in an on-line environment. Karvonen (2001) compared trust in E-commerce service among consumers from Finland, Sweden, and Iceland. Although all trusted a simple design, there were substantial differences in the initial trust in a complex design, with Finnish customers being the most wary and Icelandic customers the most trusting. More generally, the level of social trust varies substantially among countries, and this variation accounts for over 64% of the variance in the level of Internet adoption (Huang, Keser, Leland, & Shachat, 2002). The culturally dependent nature of trust suggests that findings regarding trust in automation may need to be verified when they are extrapolated from one cultural setting to another.

Cultural context can also describe systematic differences in groups of workers. For example, in the context of trust in automation, Riley (1996) found that pilots, who are accustomed to using automation, trusted and relied on automation more than did students, who were not as familiar with automation. Likewise, in field studies of operators adapting to computerized manufacturing systems, Zuboff (1988) found that the culture associated with those operators who had been exposed to computer systems led to greater trust and acceptance of automation. These results show that trust in automation, like trust in people, is culturally influenced in that it depends on the long-term experiences of a group of people.

Trust between people depends on the individual, organizational, and cultural context. This context influences trust because it affects initial levels of trust and how people interpret information regarding the agent. Some evidence shows that these relationships may also hold for trust in automation; however, very little research has investigated any of these factors. We address the types of information that form the basis of trust in the next section.

Basis of Trust: Levels of Detail and of Attributional Abstraction

Trust is a multidimensional construct that is based on trustee characteristics such as motives, intentions, and actions (Bhattacharya, Devinney, & Pillutla, 1998; Mayer et al., 1995). The basis for trust is the information that informs the person about the ability of the trustee to achieve the goals of the trustor. Two critical elements define the basis of trust. The first is the focus of trust: What is to be trusted? The second is the type of information that describes the entity to be trusted: What is the information supporting trust? This information guides expectations regarding how well the trustee can achieve the trustor's goals. The focus of trust can be described according to levels of detail, and the information supporting trust can be described according to three levels of attributional abstraction.

The focus of trust can be considered along a dimension defined by the level of detail, which varies from general trust of institutions and organizations to trust in a specific agent. With automation, this might correspond to trust in an overall system of automation as compared with trust in a particular mode of an automatic controller. Couch and Jones (1997) considered three levels of trust – generalized, network, and partner trust – which are similar to the three levels identified by McKnight et al. (1998). General trust corresponds to trust in human nature and institutions (Barber, 1983). Network trust refers to the cluster of acquaintances that constitutes a person's social network, whereas partner trust corresponds to trust in specific persons (Rempel et al., 1985).

There does not seem to be a reliable relationship between global and specific trust, suggesting that this dimension makes important distinctions regarding the evolution of trust (Couch & Jones, 1997). For example, trust in a supervisor and trust in an organization are distinct and depend on qualitatively different factors (Tan & Tan, 2000). Trust in a supervisor depends on perceived ability, integrity, and benevolence, whereas trust in the organization depends on more distal variables of perceived organizational support and justice. General trust is frequently considered to be a personality trait, whereas specific trust is frequently viewed as an outcome of a specific interaction (Couch & Jones, 1997). Several studies show, however, that general and specific trust are not necessarily tied to the "trait and state" distinctions and may instead simply refer to different levels of detail (Sitkin & Roth, 1993). Developing a high degree of functional specificity in the trust of automation may require specific information for each level of detail of the automation.

The basis of trust can be considered along a dimension of attributional abstraction, which varies from demonstrations of competence to the intentions of the agent. A recent review of trust literature concluded that three general levels summarize the bases of trust: ability, integrity, and benevolence (Mayer et al., 1995). Ability is the group of skills, competencies, and characteristics that enable the trustee to influence the domain. Integrity is the degree to which the trustee adheres to a set of principles the trustor finds acceptable. Benevolence is the extent to which the intents and motivations of the trustee are aligned with those of the trustor.

In the context of interpersonal relationships, Rempel et al. (1985) viewed trust as an evolving phenomenon, with the basis of trust changing as the relationship progressed. They argued that predictability, the degree to which future behavior can be anticipated (and which is similar to ability), forms the basis of trust early in a relationship. This is followed by dependability, which is the degree to which behavior is consistent and is similar to integrity. As the relationship matures, the basis of trust ultimately shifts to faith, which is a more general judgment that a person can be relied upon and is similar to benevolence. A similar progression emerged in a study of operators' adaptation to new technology (Zuboff, 1988). Trust in that context depended on trial-and-error experience, followed by understanding of the technology's operation, and finally faith. Lee and Moray (1992) made similar distinctions in defining the factors that influence trust in automation. They identified performance, process, and purpose as the general bases of trust.

Performance refers to the current and historical operation of the automation and includes characteristics such as reliability, predictability, and ability. Performance information describes what the automation does. More specifically, performance refers to the competency or expertise as demonstrated by its ability to achieve the operator's goals. Because performance is linked to the ability to achieve specific goals, it demonstrates the task- and situation-dependent nature of trust. This is similar to Sheridan's (1992) concept of robustness as a basis for trust in human-automation relationships. The operator will tend to trust automation that performs in a manner that reliably achieves his or her goals.

Process is the degree to which the automation's algorithms are appropriate for the situation and able to achieve the operator's goals. Process information describes how the automation operates. In interpersonal relationships, this corresponds to the consistency of actions associated with adherence to a set of acceptable principles (Mayer et al., 1995). Process as a basis for trust reflects a shift away from focus on specific behaviors and toward qualities and characteristics attributed to an agent. With process, trust is in the agent and not in the specific actions of the agent. Because of this, the process basis of trust relies on dispositional attributions and inferences drawn from the performance of the agent. In this sense, process is similar to dependability and integrity. Openness, the willingness to give and receive ideas, is another element of the process basis of trust. Interestingly, consistency and openness are more important for trust in peers than for trust in supervisors or subordinates (Schindler & Thomas, 1993).

In the context of automation, the process basis of trust refers to the algorithms and operations that govern the behavior of the automation. This concept is similar to Sheridan's (1992) concept of understandability in human-automation relationships. The operator will tend to trust the automation if its algorithms can be understood and seem capable of achieving the operator's goals in the current situation.

Purpose refers to the degree to which the automation is being used within the realm of the designer's intent. Purpose describes why the automation was developed. Purpose corresponds to faith and benevolence and reflects the perception that the trustee has a positive orientation toward the trustor. With interpersonal relationships, the perception of such a positive orientation depends on the intentions and motives of the trustee. This can take the form of abstract, generalized value congruence (Sitkin & Roth, 1993), which can be described as whether, and to what extent, the trustee has a motive to lie (Hovland, Janis, & Kelly, 1953). The purpose basis of trust reflects the attribution of these characteristics to the automation. Frequently, whether or not this attribution takes place will depend on whether the designer's intent has been communicated to the operator. If so, the operator will tend to trust automation to achieve the goals it was designed to achieve.
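The three bases can be made concrete as a simple record of the evidence an operator holds about an automated agent. This is an illustrative sketch only; the fields, the [0, 1] scores, and the equal-weight aggregation are assumptions for exposition, not a scoring scheme proposed in the literature reviewed here.

```python
from dataclasses import dataclass

@dataclass
class TrustEvidence:
    """Goal-oriented information about an automated agent, organized by
    the three bases of trust (Lee & Moray, 1992). Scores on [0, 1] and
    equal weighting are illustrative assumptions."""
    performance: float  # what it does: observed reliability and ability
    process: float      # how it operates: understandability of algorithms
    purpose: float      # why it exists: alignment with designer's intent

    def overall(self) -> float:
        return (self.performance + self.process + self.purpose) / 3.0

# A reliable, well-understood aid used as its designer intended:
evidence = TrustEvidence(performance=0.9, process=0.8, purpose=0.9)
```

Table 1 organizes the trustee characteristics identified across the trust literature into these same three summary dimensions.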

Table 1 builds on the work of Mayer et al. (1995) and shows how the many characteristics of a trustee that affect trust can be summarized by the three bases of performance, process, and purpose. As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.

TABLE 1: Summary of the Dimensions That Describe the Basis of Trust

Study                            Basis of Trust                       Summary Dimension

Barber (1983)                    Competence                           Performance
                                 Persistence                          Process
                                 Fiduciary responsibility             Purpose

Butler & Cantrell (1984)         Competence                           Performance
                                 Integrity                            Process
                                 Consistency                          Process
                                 Loyalty                              Purpose
                                 Openness                             Process

Cook & Wall (1980)               Ability                              Performance
                                 Intentions                           Purpose

Deutsch (1960)                   Ability                              Performance
                                 Intentions                           Purpose

Gabarro (1978)                   Integrity                            Process
                                 Motives                              Purpose
                                 Openness                             Process
                                 Discreetness                         Process
                                 Functional/specific competence       Performance
                                 Interpersonal competence             Performance
                                 Business sense                       Performance
                                 Judgment                             Performance

Hovland, Janis, & Kelly (1953)   Expertise                            Performance
                                 Motivation to lie                    Purpose

Jennings (1967)                  Loyalty                              Purpose
                                 Predictability                       Process
                                 Accessibility                        Process
                                 Availability                         Process

Kee & Knox (1970)                Competence                           Performance
                                 Motives                              Purpose

Mayer et al. (1995)              Ability                              Performance
                                 Integrity                            Process
                                 Benevolence                          Purpose

Mishra (1996)                    Competency                           Performance
                                 Reliability                          Performance
                                 Openness                             Process
                                 Concern                              Purpose

Moorman et al. (1993)            Integrity                            Process
                                 Willingness to reduce uncertainty    Process
                                 Confidentiality                      Process
                                 Expertise                            Performance
                                 Tactfulness                          Process
                                 Sincerity                            Process
                                 Congeniality                         Performance
                                 Timeliness                           Performance

Rempel et al. (1985)             Reliability                          Performance
                                 Dependability                        Process
                                 Faith                                Purpose

Sitkin & Roth (1993)             Context-specific reliability         Performance
                                 Generalized value congruence         Purpose

Zuboff (1988)                    Trial and error experience           Performance
                                 Understanding                        Process
                                 Leap of faith                        Purpose

Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms associated with the dimension of process; analysis of internal mechanisms can support inferences regarding the designer's intent, associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by the manager (purpose basis for trust) and the manager's actions (performance basis of trust) have a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty, in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation, in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument, in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance, as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values. The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Analogical trust can also depend on intermediaries, who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust, because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings or procedures guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oatley, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oatley). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation and that these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions. It also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner; however, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them: People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships: It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or from the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop in a way different from that in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct to understand reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).
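To make the taxonomy concrete, the sketch below encodes the four automation types as a simple data structure. It is our illustration rather than anything from Parasuraman, Sheridan, and Wickens (2000); the observability of the two middle types is left unset because the text specifies it only for information acquisition and action implementation, and the middle examples are our assumptions.

```python
# A toy encoding (illustrative only) of the four automation types and whether
# each can be observed while the operator is not relying on it.

from dataclasses import dataclass
from typing import Optional

@dataclass
class AutomationType:
    name: str
    observable_without_reliance: Optional[bool]  # None = not specified above
    example: str

AUTOMATION_TYPES = [
    AutomationType("information acquisition", True,
                   "target cueing (Yeh & Wickens, 2001)"),
    AutomationType("information analysis", None, "e.g., predictor displays"),
    AutomationType("decision selection", None, "e.g., recommended actions"),
    AutomationType("action implementation", False,
                   "pump control (Lee & Moray, 1994)"),
]

# Types that cannot be observed without reliance give operators little
# evidence with which to rebuild trust after they adopt manual control.
hidden_types = [a.name for a in AUTOMATION_TYPES
                if a.observable_without_reliance is False]
```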

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication; the automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator; the automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of e-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
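A minimal simulation can illustrate how this feedback loop amplifies small initial differences. The sketch below is our own illustration with invented parameters, not a model from the literature: trust above a threshold produces reliance, reliance produces evidence that pulls trust toward the automation's actual capability, and disuse starves trust of evidence.

```python
# Closed-loop sketch: trust gates reliance, and reliance gates the evidence
# that updates trust. All parameter values are assumptions for illustration.

def simulate_closed_loop(initial_trust, capability=0.9, steps=50,
                         rely_threshold=0.5, evidence_gain=0.2, drift=0.01):
    trust = initial_trust
    for _ in range(steps):
        if trust > rely_threshold:
            # Reliance lets the operator observe the automation, so trust
            # moves toward the automation's actual capability.
            trust += evidence_gain * (capability - trust)
        else:
            # Without reliance there is little new evidence; trust drifts down.
            trust -= drift
        trust = min(max(trust, 0.0), 1.0)
    return trust

print(simulate_closed_loop(0.52))  # rises toward 0.9: reliance builds trust
print(simulate_closed_loop(0.48))  # decays toward 0.0: disuse starves trust
```

Two operators whose initial trust differs by only 0.04 end up at opposite extremes, which matches the volatile, sometimes bimodal patterns of reliance described above.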

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function: The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly the trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
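As a rough illustration of this first-order account, trust can be written as a lag that tracks the capability of the automation with time constant tau, so a step change (a fault) produces an exponential decline and a repair produces an exponential recovery. The discretization below is a sketch under that assumption; the parameter values are illustrative and are not the estimates from Lee and Moray (1994).

```python
# First-order lag sketch: dT/dt = (capability - T) / tau, discretized with
# Euler steps. The time constant tau sets the temporal specificity of trust.

def simulate_trust(capability_series, tau=5.0, dt=1.0, initial_trust=1.0):
    trust = initial_trust
    trace = []
    for capability in capability_series:
        trust += dt * (capability - trust) / tau
        trace.append(trust)
    return trace

# Capability steps down at t = 20 (a fault) and back up at t = 60 (a repair).
capability = [1.0] * 20 + [0.4] * 40 + [1.0] * 40
trace = simulate_trust(capability)
# Trust decays exponentially toward 0.4 after the fault and recovers
# exponentially after the repair, rather than changing instantaneously.
```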

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
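To make hysteresis concrete, the sketch below assumes, purely for illustration, that the trust level needed to engage the automation is higher than the level at which an operator disengages it; the thresholds are invented, not estimates from data.

```python
# Hysteresis sketch with assumed switching thresholds: reliance switches on
# only above 0.7 and off only below 0.4, so the response to a given trust
# level depends on the direction of change.

def update_reliance(relying, trust, engage_above=0.7, disengage_below=0.4):
    if not relying and trust > engage_above:
        return True
    if relying and trust < disengage_below:
        return False
    return relying  # between the thresholds, the current state persists

# At trust = 0.55, an operator who was relying keeps relying, whereas one who
# had stopped stays on manual control: a delayed, direction-dependent response.
```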

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that the final authority for decisions and actions in time-critical situations should be allocated to automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: Low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors cause trust and reliance to decline more for those in a familiar city, whose self-confidence was high, as compared with those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and thus greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003).

Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).
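A back-of-the-envelope calculation shows why imperfect monitoring can be rational. The numbers below are invented for illustration and are not taken from Moray (2003) or Wickens et al. (2000); they simply make the cost balance explicit.

```python
# Illustrative per-trial expected cost of monitoring the automation at
# different sampling rates. All parameter values are assumptions.

p_failure = 0.02      # probability the automation fails on a given trial
cost_miss = 50.0      # cost of an undetected automation failure
cost_monitor = 0.5    # attentional cost of checking one trial

def expected_cost(monitor_rate):
    """Expected per-trial cost when a fraction of trials is monitored."""
    monitoring = monitor_rate * cost_monitor
    missed_failures = (1.0 - monitor_rate) * p_failure * cost_miss
    return monitoring + missed_failures

for rate in (0.0, 0.5, 1.0):
    print(f"monitor {rate:.0%} of trials -> expected cost {expected_cost(rate):.2f}")

# In this linear sketch the optimum sits at a corner: monitor everything when
# cost_monitor < p_failure * cost_miss, and not at all otherwise. Either way,
# accepting some incorrect outcomes can be consistent with a rational balance
# of monitoring costs against the probability of failure.
```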

One approach to reduce overtrust in the automation and increase detection of failures is through adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation and environmental situations that cause automation to perform poorly are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the process dimension was most influenced by the presence or absence of a fault, as compared with the performance or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: Negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.
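One simple way to express this asymmetry, as a hedged sketch with assumed gains rather than a fitted model, is to update trust with a much larger gain for negative experiences than for positive ones:

```python
# Asymmetric trust updating (assumed gains): faults pull trust down faster
# than successes rebuild it, so trust is conditioned by worst-case behavior.

def update_trust(trust, outcome_good, gain_up=0.05, gain_down=0.30):
    target = 1.0 if outcome_good else 0.0
    gain = gain_up if outcome_good else gain_down
    return trust + gain * (target - trust)

trust = 0.80
for outcome_good in [True, True, False, True, True]:
    trust = update_trust(trust, outcome_good)
# The single fault removes more trust than the four successes restore,
# leaving trust (about 0.62) well below its starting level of 0.80.
```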

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al.; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of the automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an on-line survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Curry, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: Specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organization and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.


Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003


behavior. Other psychological, system, and environmental constraints intervene, such as when operators do not have enough time to engage the automation even though they trust it and intend to use it, or when the effort to engage the automation outweighs its benefits (Kirlik, 1993). Trust stands between beliefs about the characteristics of the automation and the intention to rely on the automation.

Many definitions of trust also indicate the importance of the goal-oriented nature of trust. Although many definitions do not identify this aspect of trust explicitly, several mention the ability of the trustee to perform an important action and the expectation of the trustor that the trustee will perform as expected or can be relied upon (Gurtman, 1992; Johns, 1996; Mayer et al., 1995). These definitions describe the basis of trust in terms of the performance of an agent, the trustee, who furthers the goals of an individual, the trustor. In this way, trust describes a relationship that depends on the characteristics of the trustee, the trustor, and the goal-related context of the interaction. Trust is not a consideration in situations where the trustor does not depend on the trustee to perform some function related to the trustor's goals.

A simple definition of trust consistent with these considerations is the attitude that an agent will help achieve an individual's goals in a situation characterized by uncertainty and vulnerability. This basic definition must be elaborated to consider the appropriateness of trust, the influence of context, the goal-related characteristics of the agent, and the cognitive processes that govern the development and erosion of trust. Figure 1 shows how these factors interact in a dynamic process of reliance, and the following sections elaborate on various components of this conceptual model.

First, we will consider the appropriateness of trust. In Figure 1, appropriateness is shown as the relationship between the true capabilities of the agent and the level of trust. Second, we will consider the context, defined by the characteristics of the individual, organization, and culture. Figure 1 shows this as a collection of factors at the top of the diagram, each affecting a different element of the belief-attitude-intention-behavior sequence associated with reliance on an agent. Third, we will describe the basis of trust. This defines the different types of information needed to maintain an appropriate level of trust, which the bottom of Figure 1 shows in terms of the purpose, process, and performance dimensions that describe the goal-oriented characteristics of the agent. Finally, regarding the cognitive process governing trust, trust depends on the interplay among the analytic, analogical, and affective processes. In each of these sections we will cite literature regarding the organizational, sociological, interpersonal, psychological, and neurological perspectives of human-human trust and then draw parallels with human-automation trust. The understanding of trust in automation can benefit from a multidisciplinary consideration of how context, agent characteristics, and cognitive processes affect the appropriateness of trust.

Figure 1. The interaction of context, agent characteristics, and cognitive properties with the appropriateness of trust.

Appropriate Trust: Calibration, Resolution, and Specificity

Inappropriate reliance associated with misuse and disuse depends, in part, on how well trust matches the true capabilities of the automation. Supporting appropriate trust is critical in avoiding misuse and disuse of automation, just as it is in facilitating effective interpersonal relationships (Wicks, Berman, & Jones, 1999).

Calibration, resolution, and specificity of trust describe mismatches between trust and the capabilities of automation. Calibration refers to the correspondence between a person's trust in the automation and the automation's capabilities (Lee & Moray, 1994; Muir, 1987). The definitions of appropriate calibration of trust parallel those of misuse and disuse in describing appropriate reliance. Overtrust is poor calibration in which trust exceeds system capabilities; with distrust, trust falls short of the automation's capabilities. In Figure 2, good calibration is represented by the diagonal line, where the level of trust matches automation capabilities. Above this line is overtrust, and below it is distrust.
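
To make the distinction concrete, the following is a minimal sketch of how a single trust judgment might be scored against the diagonal in Figure 2. It assumes that trust and automation capability are both elicited on a common 0-1 scale and that a tolerance band around the diagonal counts as calibrated; both the scale and the tolerance are illustrative assumptions, not measures prescribed here.

```python
# Minimal sketch: classifying a trust judgment relative to automation
# capability. The common 0-1 scale and the tolerance band are assumptions
# made for illustration only.

def calibration(trust: float, capability: float, tol: float = 0.1) -> str:
    """Label a trust judgment relative to automation capability."""
    if trust > capability + tol:
        return "overtrust"   # trust exceeds capability: risk of misuse
    if trust < capability - tol:
        return "distrust"    # trust falls short of capability: risk of disuse
    return "calibrated"      # trust matches capability: appropriate reliance

print(calibration(0.9, 0.5))  # overtrust
print(calibration(0.3, 0.8))  # distrust
print(calibration(0.7, 0.7))  # calibrated
```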

Figure 2. The relationship among calibration, resolution, and automation capability in defining appropriate trust in automation. Overtrust may lead to misuse, and distrust may lead to disuse.

Resolution refers to how precisely a judgment of trust differentiates levels of automation capability (Cohen et al., 1999). Figure 2 shows that poor resolution occurs when a large range of automation capability maps onto a small range of trust. With low resolution, large changes in automation capability are reflected in small changes in trust. Specificity refers to the degree to which trust is associated with a particular component or aspect of the trustee. Functional specificity describes the differentiation of functions, subfunctions, and modes of automation. With high functional specificity, a person's trust reflects capabilities of specific subfunctions and modes. Low functional specificity means the person's trust reflects the capabilities of the entire system.

Specificity can also describe changes in trust as a function of the situation or over time. High temporal specificity means that a person's trust reflects moment-to-moment fluctuations in automation capability, whereas low temporal specificity means that the trust reflects only long-term changes in automation capability. Although temporal specificity implies a generic change over time as the person's trust adjusts to failures with the automation, temporal specificity also addresses adjustments that should occur when the situation or context changes and affects the capability of the automation. Temporal specificity reflects the sensitivity of trust to changes in context that affect automation capability. High functional and temporal specificity increase the likelihood that the level of trust will match the capabilities of a particular element of the automation at a particular time. Good calibration, high resolution, and high specificity of trust can mitigate misuse and disuse of automation, and so they can guide design, evaluation, and training to enhance human-automation partnerships.
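
As a rough illustration of how resolution and temporal specificity might be quantified, the sketch below fits a least-squares slope of trust ratings against automation capability across conditions, and against the moment-to-moment changes in capability over time. The data and the choice of a slope as the metric are assumptions made for illustration; neither is prescribed by the framework described here.

```python
# Minimal sketch: quantifying resolution and temporal specificity from
# paired observations of trust and automation capability. Data are
# hypothetical.
from statistics import mean

capability = [0.2, 0.4, 0.5, 0.7, 0.9]       # capability in successive contexts
trust      = [0.45, 0.50, 0.52, 0.55, 0.58]  # operator trust ratings

def slope(x, y):
    """Least-squares slope of y on x: change in trust per unit capability."""
    mx, my = mean(x), mean(y)
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)

# Resolution: a shallow slope means a wide range of capability maps onto
# a narrow range of trust (poor resolution); a slope near 1 tracks closely.
print(f"resolution (slope): {slope(capability, trust):.2f}")

# Temporal specificity: how closely moment-to-moment *changes* in trust
# track changes in capability (slope of the differenced series).
d_cap = [b - a for a, b in zip(capability, capability[1:])]
d_tr  = [b - a for a, b in zip(trust, trust[1:])]
print(f"temporal specificity (slope of changes): {slope(d_cap, d_tr):.2f}")
```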

Individual, Organizational, and Cultural Context

Trust does not develop in a vacuum but instead evolves in a complex individual, cultural, and organizational context. The individual context includes individual differences, such as the propensity to trust. These differences influence the initial level of trust and influence how new information is interpreted. The individual context also includes a person's specific history of interactions that have led to a particular level of trust. The organizational context can also have a strong influence on trust. The organizational context reflects the interactions between people that inform them about the trustworthiness of others, which can include reputation and gossip. The cultural context also influences trust through social norms and expectations. Understanding trust requires a careful consideration of the individual, organizational, and cultural context.

Systematic individual differences influence trust between people. Some people are more inclined to trust than are others (Gaines et al., 1997; Stack, 1978). Rotter (1967) defined trust as an enduring personality trait. This conception of trust follows a social learning theory approach, in which expectations for a particular situation are determined by specific previous experiences with situations that are perceived to be similar (Rotter, 1971). People develop beliefs about others that are generalized and extrapolated from one interaction to another. In this context, trust is a generalized expectancy that is independent of specific experiences and based on the generalization of a large number of diverse experiences. Individual differences regarding trust have important implications for the study of human-automation trust because they may influence reliance in ways that are not directly related to the characteristics of the automation.

Substantial evidence demonstrates that the tendency to trust, considered as a personality trait, can be reliably measured and can influence behavior in a systematic manner. For example, Rotter's (1980) Interpersonal Trust Scale reliably differentiates people on their propensity to trust others, with an internal consistency of .76 and a test-retest reliability of .56 with 7 months between tests. Interestingly, high-trust individuals are not more gullible than low-trust individuals (Gurtman, 1992), and the propensity to trust is not correlated with measures of intellect (Rotter, 1980); however, high-trust individuals are viewed as more trustworthy by others and display more truthful behavior (Rotter, 1971). In fact, people with a high propensity to trust predicted others' trustworthiness better than those with a low propensity to trust (Kikuchi, Wantanabe, & Yamasishi, 1996). Likewise, low- and high-trust individuals respond differently to feedback regarding collaborators' intentions and to situational risk (Kramer, 1999).

These findings may explain why individual differences in the general tendency to trust automation, as measured by a complacency scale (Parasuraman, Singh, Molloy, & Parasuraman, 1992), are not clearly related to misuse of automation. For example, Singh, Molloy, and Parasuraman (1993) found that high-complacency individuals actually detected more automation failures in a constant reliability condition (53.4% compared with 18.7% for low-complacency individuals). This unexpected result is similar to findings in studies of human trust in which highly trusting individuals were found to trust more appropriately. Several studies of trust in automation show that for some people trust changes substantially as the capability of the automation changes, and for other people trust changes relatively little (Lee & Moray, 1994; Masalonis, 2000). One possible explanation that merits further investigation is that high-trust individuals may be better able to adjust their trust to situations in which the automation is highly capable as well as to situations in which it is not.

In contrast to the description of trust as a stable personality trait, most researchers have focused on trust as an attitude (Jones & George, 1998). In describing interpersonal relationships, trust has been considered as a dynamic attitude that evolves along with the developing relationship (Rempel et al., 1985). The influence of individual differences regarding the predisposition to trust is most important when a situation is ambiguous and generalized expectancies dominate, and it becomes less important as the relationship progresses (McKnight, Cummings, & Chervany, 1998). Trust as an attitude is a history-dependent variable that depends on the prior behavior of the trusted person and the information that is shared (Deutsch, 1958). The initial level of trust is determined by past experiences in similar situations; some of these experiences may even be indirect, as in the case of gossip.

The organizational context influences how reputation, gossip, and formal and informal roles affect the trust of people who have never had any direct contact with the trustee (Burt & Knez, 1996; Kramer, 1999). For example, engineers are trusted not because of the ability of any specific person but because of the underlying education and regulatory structure that governs people in the role of an engineer (Kramer, 1999). The degree to which the organizational context affects the indirect development of trust depends on the strength of links in the social network and on the trustor's ability to establish links between the situations experienced by others and the situations he or she is confronting (Doney et al., 1998). These findings suggest that the organizational context and the indirect exposure that it facilitates can have an important influence on trust and reliance.

Beyond the individual and organizational context, culture is another element of context that can influence trust and its development (Baba, Falkenburg, & Hill, 1996). Culture can be defined as a set of social norms and expectations that reflect shared educational and life experiences associated with national differences or distinct cohorts of workers. In particular, Japanese citizens have been found to have a generally low level of trust (Yamagishi & Yamagishi, 1994). In Japan, networks of mutually committed relations play a more important role in governing exchange relationships than they do in the United States. A cross-national questionnaire that included 1,136 Japanese and 501 American respondents showed that the American respondents were more trusting of other people in general and considered reputation more important. In contrast, the Japanese respondents placed a higher value on exchange relationships that are based on personal relationships. One explanation for this is that the extreme social stability of mutually committed relationships in Japan reduces uncertainty about transactions and diminishes the role of trust (Doney et al., 1998).

More generally, cultural differences associated with power distance (e.g., dependence on authority and respect for authoritarian norms), uncertainty avoidance, and individualist and collectivist attitudes can influence the development and role of trust (Doney et al., 1998). Cultural variation can influence trust in an on-line environment. Karvonen (2001) compared trust in E-commerce services among consumers from Finland, Sweden, and Iceland. Although all trusted a simple design, there were substantial differences in the initial trust in a complex design, with Finnish customers being the most wary and Icelandic customers the most trusting. More generally, the level of social trust varies substantially among countries, and this variation accounts for over 64% of the variance in the level of Internet adoption (Huang, Keser, Leland, & Shachat, 2002). The culturally dependent nature of trust suggests that findings regarding trust in automation may need to be verified when they are extrapolated from one cultural setting to another.

Cultural context can also describe systematic differences in groups of workers. For example, in the context of trust in automation, Riley (1996) found that pilots who are accustomed to using automation trusted and relied on automation more than did students who were not as familiar with automation. Likewise, in field studies of operators adapting to computerized manufacturing systems, Zuboff (1988) found that the culture associated with those operators who had been exposed to computer systems led to greater trust and acceptance of automation. These results show that trust in automation, like trust in people, is culturally influenced in that it depends on the long-term experiences of a group of people.

Trust between people depends on the individual, organizational, and cultural context. This context influences trust because it affects initial levels of trust and how people interpret information regarding the agent. Some evidence shows that these relationships may also hold for trust in automation; however, very little research has investigated any of these factors. We address the types of information that form the basis of trust in the next section.

Basis of Trust: Levels of Detail and of Attributional Abstraction

Trust is a multidimensional construct that is based on trustee characteristics such as motives, intentions, and actions (Bhattacharya, Devinney, & Pillutla, 1998; Mayer et al., 1995). The basis for trust is the information that informs the person about the ability of the trustee to achieve the goals of the trustor. Two critical elements define the basis of trust. The first is the focus of trust: What is to be trusted? The second is the type of information that describes the entity to be trusted: What is the information supporting trust? This information guides expectations regarding how well the trustee can achieve the trustor's goals. The focus of trust can be described according to levels of detail, and the information supporting trust can be described according to three levels of attributional abstraction.

The focus of trust can be considered along a dimension defined by the level of detail, which varies from general trust of institutions and organizations to trust in a specific agent. With automation, this might correspond to trust in an overall system of automation as compared with trust in a particular mode of an automatic controller. Couch and Jones (1997) considered three levels of trust – generalized, network, and partner trust – which are similar to the three levels identified by McKnight et al. (1998). General trust corresponds to trust in human nature and institutions (Barber, 1983). Network trust refers to the cluster of acquaintances that constitutes a person's social network, whereas partner trust corresponds to trust in specific persons (Rempel et al., 1985).

There does not seem to be a reliable relationship between global and specific trust, suggesting that this dimension makes important distinctions regarding the evolution of trust (Couch & Jones, 1997). For example, trust in a supervisor and trust in an organization are distinct and depend on qualitatively different factors (Tan & Tan, 2000). Trust in a supervisor depends on perceived ability, integrity, and benevolence, whereas trust in the organization depends on more distal variables of perceived organizational support and justice. General trust is frequently considered to be a personality trait, whereas specific trust is frequently viewed as an outcome of a specific interaction (Couch & Jones, 1997). Several studies show, however, that general and specific trust are not necessarily tied to the "trait and state" distinctions and may instead simply refer to different levels of detail (Sitkin & Roth, 1993). Developing a high degree of functional specificity in the trust of automation may require specific information for each level of detail of the automation.

The basis of trust can be considered along a dimension of attributional abstraction, which varies from demonstrations of competence to the intentions of the agent. A recent review of trust literature concluded that three general levels summarize the bases of trust: ability, integrity, and benevolence (Mayer et al., 1995). Ability is the group of skills, competencies, and characteristics that enable the trustee to influence the domain. Integrity is the degree to which the trustee adheres to a set of principles the trustor finds acceptable. Benevolence is the extent to which the intents and motivations of the trustee are aligned with those of the trustor.

In the context of interpersonal relationships, Rempel et al. (1985) viewed trust as an evolving phenomenon, with the basis of trust changing as the relationship progressed. They argued that predictability, the degree to which future behavior can be anticipated (and which is similar to ability), forms the basis of trust early in a relationship. This is followed by dependability, which is the degree to which behavior is consistent and is similar to integrity. As the relationship matures, the basis of trust ultimately shifts to faith, which is a more general judgment that a person can be relied upon and is similar to benevolence. A similar progression emerged in a study of operators' adaptation to new technology (Zuboff, 1988). Trust in that context depended on trial-and-error experience, followed by understanding of the technology's operation, and finally faith. Lee and Moray (1992) made similar distinctions in defining the factors that influence trust in automation. They identified performance, process, and purpose as the general bases of trust.

Performance refers to the current and historical operation of the automation and includes characteristics such as reliability, predictability, and ability. Performance information describes what the automation does. More specifically, performance refers to the competency or expertise as demonstrated by its ability to achieve the operator's goals. Because performance is linked to the ability to achieve specific goals, it demonstrates the task- and situation-dependent nature of trust. This is similar to Sheridan's (1992) concept of robustness as a basis for trust in human-automation relationships. The operator will tend to trust automation that performs in a manner that reliably achieves his or her goals.

Process is the degree to which the automation's algorithms are appropriate for the situation and able to achieve the operator's goals. Process information describes how the automation operates. In interpersonal relationships, this corresponds to the consistency of actions associated with adherence to a set of acceptable principles (Mayer et al., 1995). Process as a basis for trust reflects a shift away from focus on specific behaviors and toward qualities and characteristics attributed to an agent. With process, trust is in the agent and not in the specific actions of the agent. Because of this, the process basis of trust relies on dispositional attributions and inferences drawn from the performance of the agent. In this sense, process is similar to dependability and integrity. Openness, the willingness to give and receive ideas, is another element of the process basis of trust. Interestingly, consistency and openness are more important for trust in peers than for trust in supervisors or subordinates (Schindler & Thomas, 1993).

In the context of automation, the process basis of trust refers to the algorithms and operations that govern the behavior of the automation. This concept is similar to Sheridan's (1992) concept of understandability in human-automation relationships. The operator will tend to trust the automation if its algorithms can be understood and seem capable of achieving the operator's goals in the current situation.

Purpose refers to the degree to which the automation is being used within the realm of the designer's intent. Purpose describes why the automation was developed. Purpose corresponds to faith and benevolence and reflects the perception that the trustee has a positive orientation toward the trustor. With interpersonal relationships, the perception of such a positive orientation depends on the intentions and motives of the trustee. This can take the form of abstract, generalized value congruence (Sitkin & Roth, 1993), which can be described as whether, and to what extent, the trustee has a motive to lie (Hovland, Janis, & Kelly, 1953). The purpose basis of trust reflects the attribution of these characteristics to the automation. Frequently, whether or not this attribution takes place will depend on whether the designer's intent has been communicated to the operator. If so, the operator will tend to trust automation to achieve the goals it was designed to achieve.

Table 1 builds on the work of Mayer et al

60 Spring 2004 ndash Human Factors

TABLE 1: Summary of the Dimensions That Describe the Basis of Trust

Study                            Basis of Trust                       Summary Dimension
Barber (1983)                    Competence                           Performance
                                 Persistence                          Process
                                 Fiduciary responsibility             Purpose
Butler & Cantrell (1984)         Competence                           Performance
                                 Integrity                            Process
                                 Consistency                          Process
                                 Loyalty                              Purpose
                                 Openness                             Process
Cook & Wall (1980)               Ability                              Performance
                                 Intentions                           Purpose
Deutsch (1960)                   Ability                              Performance
                                 Intentions                           Purpose
Gabarro (1978)                   Integrity                            Process
                                 Motives                              Purpose
                                 Openness                             Process
                                 Discreetness                         Process
                                 Functional/specific competence       Performance
                                 Interpersonal competence             Performance
                                 Business sense                       Performance
                                 Judgment                             Performance
Hovland, Janis, & Kelly (1953)   Expertise                            Performance
                                 Motivation to lie                    Purpose
Jennings (1967)                  Loyalty                              Purpose
                                 Predictability                       Process
                                 Accessibility                        Process
                                 Availability                         Process
Kee & Knox (1970)                Competence                           Performance
                                 Motives                              Purpose
Mayer et al. (1995)              Ability                              Performance
                                 Integrity                            Process
                                 Benevolence                          Purpose
Mishra (1996)                    Competency                           Performance
                                 Reliability                          Performance
                                 Openness                             Process
                                 Concern                              Purpose
Moorman et al. (1993)            Integrity                            Process
                                 Willingness to reduce uncertainty    Process
                                 Confidentiality                      Process
                                 Expertise                            Performance
                                 Tactfulness                          Process
                                 Sincerity                            Process
                                 Congeniality                         Performance
                                 Timeliness                           Performance
Rempel et al. (1985)             Reliability                          Performance
                                 Dependability                        Process
                                 Faith                                Purpose
Sitkin & Roth (1993)             Context-specific reliability         Performance
                                 Generalized value congruence         Purpose
Zuboff (1988)                    Trial-and-error experience           Performance
                                 Understanding                        Process
                                 Leap of faith                        Purpose


As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.

Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms associated with the dimension of process; analysis of internal mechanisms can support inferences regarding the designer's intent, associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by the manager (purpose basis for trust) and the manager's actions (performance basis of trust) has a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values. The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur; this effect was counteracted by gossip suggesting that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings, or procedures, guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oately, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oately). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation, and these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions; it also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner. However, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional, nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them: people are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop differently than it does in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct to understand reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.
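This closed-loop dependency can be made concrete with a small simulation. The sketch below is purely illustrative: the scalar trust value, fixed capability, reliance threshold, and update rule are assumptions chosen for clarity, not a model drawn from the studies cited above.

def simulate_trust(steps=100, capability=0.8, threshold=0.5,
                   initial_trust=0.4, time_constant=10.0):
    # Trust drifts toward the automation's observed capability, but the
    # operator gathers far less evidence when not relying on (and thus
    # rarely observing) the automation, so low trust slows its own growth.
    trust = initial_trust
    trajectory = []
    for step in range(steps):
        relying = trust > threshold
        observation_rate = 1.0 if relying else 0.1
        trust += (observation_rate / time_constant) * (capability - trust)
        trajectory.append((step, relying, trust))
    return trajectory

for step, relying, trust in simulate_trust()[::25]:
    print(f"step={step:3d}  relying={relying!s:5}  trust={trust:.2f}")

With these invented numbers, trust creeps upward for roughly 30 steps before reliance begins and then converges quickly, one way to picture how a distrusted and therefore unused system starves trust of the evidence it needs to grow.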

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication; the automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator; the automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of E-mail messages (Coutu, 1998). A similar effect is seen when initial expectations of perfect automation performance lead to disuse once people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function: the largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
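A minimal formalization of such a first-order model, written in our own notation rather than the equations reported by Lee and Moray (1994), treats trust $T(t)$ as lagging the automation's current capability $P(t)$ with time constant $\tau$:

$$\tau \frac{dT(t)}{dt} + T(t) = P(t)$$

A step change in capability from $P_0$ to $P_1$ at $t = 0$ then yields the exponential response

$$T(t) = P_1 + \left( T(0) - P_1 \right) e^{-t/\tau}$$

so the largest change in trust occurs immediately and the residual effect decays over time. The time constant $\tau$ expresses the temporal specificity discussed above: a small $\tau$ tracks changes in capability closely, whereas a large $\tau$ responds sluggishly.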

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration, such as hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change; enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
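Hysteresis in reliance can be illustrated with a toy switching rule; the two thresholds below are arbitrary assumptions chosen only to make the direction-dependent behavior visible, not values from the cited work.

def update_reliance(trust, currently_relying,
                    engage_above=0.7, disengage_below=0.4):
    # Direction-dependent switching: the trust needed to start relying is
    # higher than the trust at which reliance is abandoned.
    if currently_relying:
        return trust >= disengage_below
    return trust > engage_above

relying = False
for trust in (0.3, 0.6, 0.8, 0.6, 0.3):  # sweep trust up, then back down
    relying = update_reliance(trust, relying)
    print(f"trust={trust:.1f} -> relying={relying}")

On the upward sweep, reliance does not begin until trust exceeds 0.7; on the downward sweep it persists at a trust level (0.6) that was insufficient to trigger it, which is the delayed, direction-dependent response that defines hysteresis.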

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that automation should have the final authority for decisions and actions in time-critical situations (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors caused trust and reliance to decline more for those in a familiar city, whose self-confidence was high, than for those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation's performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.
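A minimal way to express this interaction, assuming a simple threshold rule rather than any fitted model from the studies above, is to treat the choice as driven by the difference between trust and self-confidence:

def rely_on_automation(trust, self_confidence, bias=0.0):
    # Choose automation when trust exceeds self-confidence by more than a
    # bias term; a positive bias models a general reluctance to delegate.
    return (trust - self_confidence) > bias

print(rely_on_automation(trust=0.8, self_confidence=0.4))  # True: delegate
print(rely_on_automation(trust=0.4, self_confidence=0.8))  # False: manual

Under such a rule, miscalibrated self-confidence shifts the decision boundary just as miscalibrated trust does, which is why biases in either quantity can distort reliance.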

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).
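The logic of such a balance can be sketched with a back-of-envelope expected-cost calculation. The failure probability, costs, and diminishing-returns detection function below are invented for illustration and are not taken from Moray (2003).

import math

def expected_cost(rate, p_failure=0.05, cost_miss=100.0,
                  cost_monitor=1.0, k=5.0):
    # More sampling costs attention; less sampling lets more failures go
    # undetected. Detection improves with diminishing returns on attention.
    p_detect = 1.0 - math.exp(-k * rate)
    return rate * cost_monitor + p_failure * (1.0 - p_detect) * cost_miss

for rate in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"sampling rate {rate:.2f}: expected cost {expected_cost(rate):.2f}")

With these invented numbers the minimum expected cost falls at an intermediate sampling rate, so imperfect vigilance can reflect a rational allocation of attention rather than a failure of it.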

One approach to reduce overtrust in the automation and increase detection of failures is through adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation and environmental situations that cause automation to perform poorly are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and product dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the process dimension was most influenced by the presence or absence of a fault, as compared with the ratings for the performance or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al.; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of the automation-related information is usually mediated by a display. This suggests that the appropriateness of trust, that is, the match between the trust and the capabilities of the automation, depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an online survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form, in particular the emphasis on concrete, realistic representations, can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Curry, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk than when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.


Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003



as the relationship between the true capabilities of the agent and the level of trust. Second, we will consider the context, defined by the characteristics of the individual, organization, and culture. Figure 1 shows this as a collection of factors at the top of the diagram, each affecting a different element of the belief-attitude-intention-behavior sequence associated with reliance on an agent. Third, we will describe the basis of trust. This defines the different types of information needed to maintain an appropriate level of trust, which the bottom of Figure 1 shows in terms of the purpose, process, and performance dimensions that describe the goal-oriented characteristics of the agent. Finally, regarding the cognitive process governing trust, trust depends on the interplay among the analytic, analogical, and affective processes. In each of these sections we will cite literature regarding the organizational, sociological, interpersonal, psychological, and neurological perspectives of human-human trust and then draw parallels with human-automation trust. The understanding of trust in automation can benefit from a multidisciplinary consideration of how context, agent characteristics, and cognitive processes affect the appropriateness of trust.

Appropriate Trust: Calibration, Resolution, and Specificity

Inappropriate reliance associated with misuse and disuse depends in part on how well trust matches the true capabilities of the automation. Supporting appropriate trust is critical in avoiding misuse and disuse of automation, just as it is in facilitating effective interpersonal relationships (Wicks, Berman, & Jones, 1999).

Calibration, resolution, and specificity of trust describe mismatches between trust and the capabilities of automation. Calibration refers to the correspondence between a person's trust in the automation and the automation's capabilities (Lee & Moray, 1994; Muir, 1987). The definitions of appropriate calibration of trust parallel those of misuse and disuse in describing appropriate reliance. Overtrust is poor calibration in which trust exceeds system capabilities; with distrust, trust falls short of the automation's capabilities. In Figure 2, good calibration is represented by the diagonal line, where the level of trust matches automation capabilities. Above this line is overtrust, and below it is distrust.

Resolution refers to how precisely a judgment of trust differentiates levels of automation capability (Cohen et al., 1999). Figure 2 shows that poor resolution occurs when a large range of automation capability maps onto a small range of trust. With low resolution, large changes in automation capability are reflected in small changes in trust. Specificity refers to the degree to which trust is associated with a particular component or aspect of the trustee. Functional specificity describes the differentiation of functions, subfunctions, and modes of automation. With high functional specificity, a person's trust reflects capabilities of specific subfunctions and modes. Low functional specificity means the person's trust reflects the capabilities of the entire system.

[Figure 2. The relationship among calibration, resolution, and automation capability in defining appropriate trust in automation. Overtrust may lead to misuse and distrust may lead to disuse. The figure plots trust against automation capability (trustworthiness): the diagonal marks calibrated trust, the region above it overtrust, and the region below it distrust; it also contrasts good resolution with poor resolution, in which a large range of system capability maps onto a small range of trust.]

Specificity can also describe changes in trust as a function of the situation or over time. High temporal specificity means that a person's trust reflects moment-to-moment fluctuations in automation capability, whereas low temporal specificity means that the trust reflects only long-term changes in automation capability. Although temporal specificity implies a generic change over time as the person's trust adjusts to failures with the automation, temporal specificity also addresses adjustments that should occur when the situation or context changes and affects the capability of the automation. Temporal specificity reflects the sensitivity of trust to changes in context that affect automation capability. High functional and temporal specificity increase the likelihood that the level of trust will match the capabilities of a particular element of the automation at a particular time. Good calibration, high resolution, and high specificity of trust can mitigate misuse and disuse of automation, and so they can guide design, evaluation, and training to enhance human-automation partnerships.
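These definitions can be made concrete by scoring paired ratings of trust and automation capability across situations, as in the sketch below. The two metrics (a mean signed trust-capability gap for calibration and a range ratio for resolution) and the sample numbers are illustrative assumptions, not measures proposed here.

```python
import statistics

def calibration_error(trust, capability):
    # Mean signed gap between trust and capability (both rated 0-1):
    # positive values indicate overtrust, negative values distrust.
    return statistics.mean(t - c for t, c in zip(trust, capability))

def resolution(trust, capability):
    # Ratio of the trust range to the capability range: values near 1
    # mean trust differentiates capability well; values near 0 mean a
    # large range of capability maps onto a small range of trust.
    return (max(trust) - min(trust)) / (max(capability) - min(capability))

# Hypothetical ratings over five situations
capability      = [0.90, 0.80, 0.60, 0.40, 0.20]
well_calibrated = [0.85, 0.80, 0.55, 0.40, 0.25]
overtrusting    = [0.95, 0.95, 0.90, 0.85, 0.80]

print(round(calibration_error(well_calibrated, capability), 3))  # ~ -0.01: well calibrated
print(round(calibration_error(overtrusting, capability), 3))     # ~ 0.31: overtrust
print(round(resolution(overtrusting, capability), 2))            # ~ 0.21: poor resolution
```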

Individual, Organizational, and Cultural Context

Trust does not develop in a vacuum but instead evolves in a complex individual, cultural, and organizational context. The individual context includes individual differences, such as the propensity to trust. These differences influence the initial level of trust and influence how new information is interpreted. The individual context also includes a person's specific history of interactions that have led to a particular level of trust. The organizational context can also have a strong influence on trust. The organizational context reflects the interactions between people that inform them about the trustworthiness of others, which can include reputation and gossip. The cultural context also influences trust through social norms and expectations. Understanding trust requires a careful consideration of the individual, organizational, and cultural context.

Systematic individual differences influence trust between people. Some people are more inclined to trust than are others (Gaines et al., 1997; Stack, 1978). Rotter (1967) defined trust as an enduring personality trait. This conception of trust follows a social learning theory approach, in which expectations for a particular situation are determined by specific previous experiences with situations that are perceived to be similar (Rotter, 1971). People develop beliefs about others that are generalized and extrapolated from one interaction to another. In this context, trust is a generalized expectancy that is independent of specific experiences and based on the generalization of a large number of diverse experiences. Individual differences regarding trust have important implications for the study of human-automation trust because they may influence reliance in ways that are not directly related to the characteristics of the automation.

Substantial evidence demonstrates that the tendency to trust, considered as a personality trait, can be reliably measured and can influence behavior in a systematic manner. For example, Rotter's (1980) Interpersonal Trust Scale reliably differentiates people on their propensity to trust others, with an internal consistency of .76 and a test-retest reliability of .56 with 7 months between tests. Interestingly, high-trust individuals are not more gullible than low-trust individuals (Gurtman, 1992), and the propensity to trust is not correlated with measures of intellect (Rotter, 1980); however, high-trust individuals are viewed as more trustworthy by others and display more truthful behavior (Rotter, 1971). In fact, people with a high propensity to trust predicted others' trustworthiness better than those with a low propensity to trust (Kikuchi, Wantanabe, & Yamasishi, 1996). Likewise, low- and high-trust individuals respond differently to feedback regarding collaborators' intentions and to situational risk (Kramer, 1999).

These findings may explain why individual differences in the general tendency to trust automation, as measured by a complacency scale (Parasuraman, Singh, Molloy, & Parasuraman, 1992), are not clearly related to misuse of automation. For example, Singh, Molloy, and Parasuraman (1993) found that high-complacency individuals actually detected more automation failures in a constant reliability condition (53.4% compared with 18.7% for low-complacency individuals). This unexpected result is similar to findings in studies of human trust in which highly trusting individuals were found to trust more appropriately. Several studies of trust in automation show that for some people trust changes substantially as the capability of the automation changes, and for other people trust changes relatively little (Lee & Moray, 1994; Masalonis, 2000). One possible explanation that merits further investigation is that high-trust individuals may be better able to adjust their trust to situations in which the automation is highly capable as well as to situations in which it is not.

In contrast to the description of trust as a stable personality trait, most researchers have focused on trust as an attitude (Jones & George, 1998). In describing interpersonal relationships, trust has been considered as a dynamic attitude that evolves along with the developing relationship (Rempel et al., 1985). The influence of individual differences regarding the predisposition to trust is most important when a situation is ambiguous and generalized expectancies dominate, and it becomes less important as the relationship progresses (McKnight, Cummings, & Chervany, 1998). Trust as an attitude is a history-dependent variable that depends on the prior behavior of the trusted person and the information that is shared (Deutsch, 1958). The initial level of trust is determined by past experiences in similar situations; some of these experiences may even be indirect, as in the case of gossip.

The organizational context influences how reputation, gossip, and formal and informal roles affect the trust of people who have never had any direct contact with the trustee (Burt & Knez, 1996; Kramer, 1999). For example, engineers are trusted not because of the ability of any specific person but because of the underlying education and regulatory structure that governs people in the role of an engineer (Kramer, 1999). The degree to which the organizational context affects the indirect development of trust depends on the strength of links in the social network and on the trustor's ability to establish links between the situations experienced by others and the situations he or she is confronting (Doney et al., 1998). These findings suggest that the organizational context and the indirect exposure that it facilitates can have an important influence on trust and reliance.

Beyond the individual and organizational context, culture is another element of context that can influence trust and its development (Baba, Falkenburg, & Hill, 1996). Culture can be defined as a set of social norms and expectations that reflect shared educational and life experiences associated with national differences or distinct cohorts of workers. In particular, Japanese citizens have been found to have a generally low level of trust (Yamagishi & Yamagishi, 1994). In Japan, networks of mutually committed relations play a more important role in governing exchange relationships than they do in the United States. A cross-national questionnaire that included 1,136 Japanese and 501 American respondents showed that the American respondents were more trusting of other people in general and considered reputation more important. In contrast, the Japanese respondents placed a higher value on exchange relationships that are based on personal relationships. One explanation for this is that the extreme social stability of mutually committed relationships in Japan reduces uncertainty about transactions and diminishes the role of trust (Doney et al., 1998).

More generally, cultural differences associated with power distance (e.g., dependence on authority and respect for authoritarian norms), uncertainty avoidance, and individualist and collectivist attitudes can influence the development and role of trust (Doney et al., 1998). Cultural variation can influence trust in an on-line environment. Karvonen (2001) compared trust in E-commerce service among consumers from Finland, Sweden, and Iceland. Although all trusted a simple design, there were substantial differences in the initial trust in a complex design, with Finnish customers being the most wary and Icelandic customers the most trusting. More generally, the level of social trust varies substantially among countries, and this variation accounts for over 64% of the variance in the level of Internet adoption (Huang, Keser, Leland, & Shachat, 2002). The culturally dependent nature of trust suggests that findings regarding trust in automation may need to be verified when they are extrapolated from one cultural setting to another.

Cultural context can also describe systematic differences in groups of workers. For example, in the context of trust in automation, Riley (1996) found that pilots, who are accustomed to using automation, trusted and relied on automation more than did students, who were not as familiar with automation. Likewise, in field studies of operators adapting to computerized manufacturing systems, Zuboff (1988) found that the culture associated with those operators who had been exposed to computer systems led to greater trust and acceptance of automation. These results show that trust in automation, like trust in people, is culturally influenced in that it depends on the long-term experiences of a group of people.

Trust between people depends on the individual, organizational, and cultural context. This context influences trust because it affects initial levels of trust and how people interpret information regarding the agent. Some evidence shows that these relationships may also hold for trust in automation; however, very little research has investigated any of these factors. We address the types of information that form the basis of trust in the next section.

Basis of Trust: Levels of Detail and of Attributional Abstraction

Trust is a multidimensional construct that is based on trustee characteristics such as motives, intentions, and actions (Bhattacharya, Devinney, & Pillutla, 1998; Mayer et al., 1995). The basis for trust is the information that informs the person about the ability of the trustee to achieve the goals of the trustor. Two critical elements define the basis of trust. The first is the focus of trust: What is to be trusted? The second is the type of information that describes the entity to be trusted: What is the information supporting trust? This information guides expectations regarding how well the trustee can achieve the trustor's goals. The focus of trust can be described according to levels of detail, and the information supporting trust can be described according to three levels of attributional abstraction.

The focus of trust can be considered along a dimension defined by the level of detail, which varies from general trust of institutions and organizations to trust in a specific agent. With automation, this might correspond to trust in an overall system of automation as compared with trust in a particular mode of an automatic controller. Couch and Jones (1997) considered three levels of trust – generalized, network, and partner trust – which are similar to the three levels identified by McKnight et al. (1998). General trust corresponds to trust in human nature and institutions (Barber, 1983). Network trust refers to the cluster of acquaintances that constitutes a person's social network, whereas partner trust corresponds to trust in specific persons (Rempel et al., 1985).

There does not seem to be a reliable relationship between global and specific trust, suggesting that this dimension makes important distinctions regarding the evolution of trust (Couch & Jones, 1997). For example, trust in a supervisor and trust in an organization are distinct and depend on qualitatively different factors (Tan & Tan, 2000). Trust in a supervisor depends on perceived ability, integrity, and benevolence, whereas trust in the organization depends on more distal variables of perceived organizational support and justice. General trust is frequently considered to be a personality trait, whereas specific trust is frequently viewed as an outcome of a specific interaction (Couch & Jones, 1997). Several studies show, however, that general and specific trust are not necessarily tied to the "trait and state" distinctions and may instead simply refer to different levels of detail (Sitkin & Roth, 1993). Developing a high degree of functional specificity in the trust of automation may require specific information for each level of detail of the automation.

The basis of trust can be considered along a dimension of attributional abstraction, which varies from demonstrations of competence to the intentions of the agent. A recent review of trust literature concluded that three general levels summarize the bases of trust: ability, integrity, and benevolence (Mayer et al., 1995). Ability is the group of skills, competencies, and characteristics that enable the trustee to influence the domain. Integrity is the degree to which the trustee adheres to a set of principles the trustor finds acceptable. Benevolence is the extent to which the intents and motivations of the trustee are aligned with those of the trustor.

In the context of interpersonal relationships, Rempel et al. (1985) viewed trust as an evolving phenomenon, with the basis of trust changing as the relationship progressed. They argued that predictability, the degree to which future behavior can be anticipated (and which is similar to ability), forms the basis of trust early in a relationship. This is followed by dependability, which is the degree to which behavior is consistent and is similar to integrity. As the relationship matures, the basis of trust ultimately shifts to faith, which is a more general judgment that a person can be relied upon and is similar to benevolence. A similar progression emerged in a study of operators' adaptation to new technology (Zuboff, 1988). Trust in that context depended on trial-and-error experience, followed by understanding of the technology's operation, and finally faith. Lee and Moray (1992) made similar distinctions in defining the factors that influence trust in automation. They identified performance, process, and purpose as the general bases of trust.

Performance refers to the current and historical operation of the automation and includes characteristics such as reliability, predictability, and ability. Performance information describes what the automation does. More specifically, performance refers to the competency or expertise as demonstrated by its ability to achieve the operator's goals. Because performance is linked to the ability to achieve specific goals, it demonstrates the task- and situation-dependent nature of trust. This is similar to Sheridan's (1992) concept of robustness as a basis for trust in human-automation relationships. The operator will tend to trust automation that performs in a manner that reliably achieves his or her goals.

Process is the degree to which the automation's algorithms are appropriate for the situation and able to achieve the operator's goals. Process information describes how the automation operates. In interpersonal relationships, this corresponds to the consistency of actions associated with adherence to a set of acceptable principles (Mayer et al., 1995). Process as a basis for trust reflects a shift away from focus on specific behaviors and toward qualities and characteristics attributed to an agent. With process, trust is in the agent and not in the specific actions of the agent. Because of this, the process basis of trust relies on dispositional attributions and inferences drawn from the performance of the agent. In this sense, process is similar to dependability and integrity. Openness, the willingness to give and receive ideas, is another element of the process basis of trust. Interestingly, consistency and openness are more important for trust in peers than for trust in supervisors or subordinates (Schindler & Thomas, 1993).

In the context of automation, the process basis of trust refers to the algorithms and operations that govern the behavior of the automation. This concept is similar to Sheridan's (1992) concept of understandability in human-automation relationships. The operator will tend to trust the automation if its algorithms can be understood and seem capable of achieving the operator's goals in the current situation.

Purpose refers to the degree to which the automation is being used within the realm of the designer's intent. Purpose describes why the automation was developed. Purpose corresponds to faith and benevolence and reflects the perception that the trustee has a positive orientation toward the trustor. With interpersonal relationships, the perception of such a positive orientation depends on the intentions and motives of the trustee. This can take the form of abstract, generalized value congruence (Sitkin & Roth, 1993), which can be described as whether and to what extent the trustee has a motive to lie (Hovland, Janis, & Kelly, 1953). The purpose basis of trust reflects the attribution of these characteristics to the automation. Frequently, whether or not this attribution takes place will depend on whether the designer's intent has been communicated to the operator. If so, the operator will tend to trust the automation to achieve the goals it was designed to achieve.

Table 1 builds on the work of Mayer et al. (1995) and shows how the many characteristics of a trustee that affect trust can be summarized by the three bases of performance, process, and purpose.


TABLE 1: Summary of the Dimensions That Describe the Basis of Trust (each basis of trust is followed by its summary dimension in parentheses)

Barber (1983): Competence (Performance); Persistence (Process); Fiduciary responsibility (Purpose)
Butler & Cantrell (1984): Competence (Performance); Integrity (Process); Consistency (Process); Loyalty (Purpose); Openness (Process)
Cook & Wall (1980): Ability (Performance); Intentions (Purpose)
Deutsch (1960): Ability (Performance); Intentions (Purpose)
Gabarro (1978): Integrity (Process); Motives (Purpose); Openness (Process); Discreetness (Process); Functional/specific competence (Performance); Interpersonal competence (Performance); Business sense (Performance); Judgment (Performance)
Hovland, Janis, & Kelly (1953): Expertise (Performance); Motivation to lie (Purpose)
Jennings (1967): Loyalty (Purpose); Predictability (Process); Accessibility (Process); Availability (Process)
Kee & Knox (1970): Competence (Performance); Motives (Purpose)
Mayer et al. (1995): Ability (Performance); Integrity (Process); Benevolence (Purpose)
Mishra (1996): Competency (Performance); Reliability (Performance); Openness (Process); Concern (Purpose)
Moorman et al. (1993): Integrity (Process); Willingness to reduce uncertainty (Process); Confidentiality (Process); Expertise (Performance); Tactfulness (Process); Sincerity (Process); Congeniality (Performance); Timeliness (Performance)
Rempel et al. (1985): Reliability (Performance); Dependability (Process); Faith (Purpose)
Sitkin & Roth (1993): Context-specific reliability (Performance); Generalized value congruence (Purpose)
Zuboff (1988): Trial-and-error experience (Performance); Understanding (Process); Leap of faith (Purpose)


As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.

Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms associated with the dimension of process; analysis of internal mechanisms can support inferences regarding the designer's intent, associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by the manager (purpose basis for trust) and the manager's actions (performance basis of trust) has a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty, in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation, in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument, in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values. The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings, or procedures, guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oatley, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oatley). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation and that these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions; it also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner. However, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional, nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust, in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dörner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejtersen, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation, in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to the automation, depends on people's assessment of how others perceive them: People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or the intended use of the system (purpose). Early in the relationship with automation, there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop in a way different from that in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct to understand reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.
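
This distinction can be summarized in a small illustrative sketch (ours, not Meyer's; the function and signal names are hypothetical): reliance-oriented automation implies action when its "all clear" signal is absent, whereas compliance-oriented automation demands action when its warning is present.

```python
# Illustrative sketch (an assumption for exposition, not from Meyer, 2001):
# how an operator's action maps onto the automation's indication under
# reliance- and compliance-oriented automation.

def operator_acts(indicator_on: bool, orientation: str) -> bool:
    """Return True when the operator should intervene.

    reliance-oriented:   the indicator signals 'operating normally',
                         so the operator acts when the signal is ABSENT.
    compliance-oriented: the indicator signals 'abnormal situation',
                         so the operator acts when the signal is PRESENT.
    """
    if orientation == "reliance":
        return not indicator_on   # silence of the 'all clear' implies trouble
    if orientation == "compliance":
        return indicator_on       # an explicit alarm demands a response
    raise ValueError(orientation)

# The same underlying state maps to opposite indicator conventions:
assert operator_acts(indicator_on=False, orientation="reliance")
assert operator_acts(indicator_on=True, orientation="compliance")
```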

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of E-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
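
A minimal simulation can illustrate this sensitivity to initial conditions. The update rule and parameter values below are illustrative assumptions rather than a fitted model: trust moves toward the automation's capability only while the operator relies on the automation and can therefore observe it.

```python
# Toy model of the closed loop in Figure 4 (assumed parameters, for
# illustration only). Trust is updated by observation, and observation
# is gated by reliance, so small initial differences in trust can drive
# reliance to opposite extremes.

def simulate(initial_trust, capability=0.8, threshold=0.5,
             learn=0.2, steps=30):
    trust = initial_trust
    relying = trust > threshold
    for _ in range(steps):
        relying = trust > threshold              # intention follows trust
        if relying:
            trust += learn * (capability - trust)  # observation updates trust
        # if not relying, no new observations arrive: trust stays frozen
    return trust, relying

print(simulate(0.52))  # slightly trusting: relies, trust climbs toward 0.8
print(simulate(0.48))  # slightly distrusting: never engages, trust stays 0.48
```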

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function. The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
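
One minimal way to write such a first-order model is sketched below. The notation is ours and only illustrative; the published analyses used time-series techniques, so this should be read as a schematic of the dynamics described above rather than the authors' fitted equation.

```latex
% Illustrative first-order model of trust dynamics (assumed notation):
% T(t) is trust, P(t) is automation performance, \tau is the time
% constant of trust, and k scales performance into trust.
\tau \frac{dT(t)}{dt} + T(t) = k\,P(t)
% A step change in P (e.g., a fault) produces an exponential response,
%   T(t) = T_{\infty} + (T_{0} - T_{\infty})\, e^{-t/\tau},
% so the largest change occurs immediately, a residual effect is
% distributed over time, and \tau determines how quickly trust tracks
% changes in the capability of the automation.
```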

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schöner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
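
Hysteresis can be made concrete with a toy sketch (an illustrative assumption, not a fitted model): if the trust level required to engage the automation is higher than the level at which it is abandoned, reliance depends on the direction from which trust approaches an intermediate value.

```python
# Illustrative hysteresis in reliance (assumed thresholds): the trust
# level at which an operator re-engages the automation is higher than
# the level at which he or she abandons it, so the response depends on
# the direction of the change in trust.

def update_reliance(trust, currently_relying,
                    engage_above=0.7, abandon_below=0.4):
    """Two thresholds create a hysteresis band between 0.4 and 0.7."""
    if currently_relying:
        return trust >= abandon_below   # keep relying until trust falls far
    return trust > engage_above         # re-engage only after trust recovers

relying = True
for trust in [0.8, 0.6, 0.5, 0.3, 0.5, 0.6, 0.75]:
    relying = update_reliance(trust, relying)
    print(f"trust={trust:.2f} -> relying={relying}")
# Falling through 0.5 preserves reliance; rising back through 0.5 does not.
```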

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that the final authority for decisions and actions should be allocated to the automation in time-critical situations (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: Low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors caused trust and reliance to decline more for those in a familiar city, whose self-confidence was high, than for those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003).

Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).

One approach to reduce overtrust in the automation and increase detection of failures is through adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation and environmental situations that cause automation to perform poorly are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and product dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the performance dimension was most influenced by the presence or absence of a fault, as compared with the process or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: Negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude: A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al.; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of the automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display, or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an on-line survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.
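
The finding that observers weighted luminance-coded sources according to their reliability parallels the signal detection ideal of weighting each source's evidence by its diagnosticity before combining. The short Python sketch below illustrates that general principle only; it is not the procedure used by Montgomery and Sorkin (1996), and the reliability values and log-odds weighting are illustrative assumptions:

    import math

    def combine_sources(readings, reliabilities):
        """Combine binary source reports into one decision variable,
        weighting each source by the log odds implied by its reliability.
        readings: +1 ('signal present') or -1 ('signal absent') per source.
        reliabilities: probability that each source reports correctly."""
        score = 0.0
        for r, p in zip(readings, reliabilities):
            weight = math.log(p / (1.0 - p))  # more reliable -> larger weight
            score += weight * r
        return score  # positive -> respond 'signal present'

    # A highly reliable source (0.9) outvotes two weaker ones (0.6 each).
    print(combine_sources([+1, -1, -1], [0.9, 0.6, 0.6]))  # > 0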

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding that specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organization and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.


Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003



capability (Cohen et al., 1999). Figure 2 shows that poor resolution occurs when a large range of automation capability maps onto a small range of trust. With low resolution, large changes in automation capability are reflected in small changes in trust. Specificity refers to the degree to which trust is associated with a particular component or aspect of the trustee. Functional specificity describes the differentiation of functions, subfunctions, and modes of automation. With high functional specificity, a person's trust reflects capabilities of specific subfunctions and modes. Low functional specificity means the person's trust reflects the capabilities of the entire system.

Specificity can also describe changes in trust as a function of the situation or over time. High temporal specificity means that a person's trust reflects moment-to-moment fluctuations in automation capability, whereas low temporal specificity means that the trust reflects only long-term changes in automation capability. Although temporal specificity implies a generic change over time as the person's trust adjusts to failures with the automation, temporal specificity also addresses adjustments that should occur when the situation or context changes and affects the capability of the automation. Temporal specificity reflects the sensitivity of trust to changes in context that affect automation capability. High functional and temporal specificity increase the likelihood that the level of trust will match the capabilities of a particular element of the automation at a particular time. Good calibration, high resolution, and high specificity of trust can mitigate misuse and disuse of automation, and so they can guide design, evaluation, and training to enhance human-automation partnerships.
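
As an illustration of how these definitions could be made operational, the following hypothetical Python sketch compares paired observations of automation capability and rated trust, both scaled to the interval [0, 1]. The two indices are assumptions introduced here for illustration, not standard published measures: calibration is indexed by the mean absolute gap between trust and capability, and resolution by the trust range relative to the capability range:

    def calibration_and_resolution(capability, trust):
        """capability, trust: parallel lists of values scaled to [0, 1]."""
        n = len(capability)
        # Calibration: how closely trust tracks capability, pair by pair.
        calibration_error = sum(abs(c - t) for c, t in zip(capability, trust)) / n
        # Resolution: how much of the capability range shows up in trust.
        cap_range = max(capability) - min(capability)
        trust_range = max(trust) - min(trust)
        resolution_ratio = trust_range / cap_range if cap_range else float('nan')
        return calibration_error, resolution_ratio

    # Poor resolution: capability spans 0.2-0.9, but trust barely moves.
    print(calibration_and_resolution([0.2, 0.5, 0.9], [0.55, 0.6, 0.65]))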

Individual, Organizational, and Cultural Context

Trust does not develop in a vacuum but instead evolves in a complex individual, cultural, and organizational context. The individual context includes individual differences, such as the propensity to trust. These differences influence the initial level of trust and influence how new information is interpreted. The individual context also includes a person's specific history of interactions that have led to a particular level of trust. The organizational context can also have a strong influence on trust. The organizational context reflects the interactions between people that inform them about the trustworthiness of others, which can include reputation and gossip. The cultural context also influences trust through social norms and expectations. Understanding trust requires a careful consideration of the individual, organizational, and cultural context.

Systematic individual differences influence trust between people. Some people are more inclined to trust than are others (Gaines et al., 1997; Stack, 1978). Rotter (1967) defined trust as an enduring personality trait. This conception of trust follows a social learning theory approach, in which expectations for a particular situation are determined by specific previous experiences with situations that are perceived to be similar (Rotter, 1971). People develop beliefs about others that are generalized and extrapolated from one interaction to another. In this context, trust is a generalized expectancy that is independent of specific experiences and based on the generalization of a large number of diverse experiences. Individual differences regarding trust have important implications for the study of human-automation trust because they may influence reliance in ways that are not directly related to the characteristics of the automation.

Substantial evidence demonstrates that the tendency to trust, considered as a personality trait, can be reliably measured and can influence behavior in a systematic manner. For example, Rotter's (1980) Interpersonal Trust Scale reliably differentiates people on their propensity to trust others, with an internal consistency of .76 and a test-retest reliability of .56 with 7 months between tests. Interestingly, high-trust individuals are not more gullible than low-trust individuals (Gurtman, 1992), and the propensity to trust is not correlated with measures of intellect (Rotter, 1980); however, high-trust individuals are viewed as more trustworthy by others and display more truthful behavior (Rotter, 1971). In fact, people with a high propensity to trust predicted others' trustworthiness better than those with a low propensity to trust (Kikuchi, Wantanabe, & Yamasishi, 1996). Likewise, low- and high-trust individuals respond differently to feedback regarding collaborators' intentions and to situational risk (Kramer, 1999).

These findings may explain why individual differences in the general tendency to trust automation, as measured by a complacency scale (Parasuraman, Singh, Molloy, & Parasuraman, 1992), are not clearly related to misuse of automation. For example, Singh, Molloy, and Parasuraman (1993) found that high-complacency individuals actually detected more automation failures in a constant reliability condition (53.4% compared with 18.7% for low-complacency individuals). This unexpected result is similar to findings in studies of human trust in which highly trusting individuals were found to trust more appropriately. Several studies of trust in automation show that for some people trust changes substantially as the capability of the automation changes, and for other people trust changes relatively little (Lee & Moray, 1994; Masalonis, 2000). One possible explanation that merits further investigation is that high-trust individuals may be better able to adjust their trust to situations in which the automation is highly capable, as well as to situations in which it is not.

In contrast to the description of trust as a stable personality trait, most researchers have focused on trust as an attitude (Jones & George, 1998). In describing interpersonal relationships, trust has been considered as a dynamic attitude that evolves along with the developing relationship (Rempel et al., 1985). The influence of individual differences regarding the predisposition to trust is most important when a situation is ambiguous and generalized expectancies dominate, and it becomes less important as the relationship progresses (McKnight, Cummings, & Chervany, 1998). Trust as an attitude is a history-dependent variable that depends on the prior behavior of the trusted person and the information that is shared (Deutsch, 1958). The initial level of trust is determined by past experiences in similar situations; some of these experiences may even be indirect, as in the case of gossip.

The organizational context influences how reputation, gossip, and formal and informal roles affect the trust of people who have never had any direct contact with the trustee (Burt & Knez, 1996; Kramer, 1999). For example, engineers are trusted not because of the ability of any specific person but because of the underlying education and regulatory structure that governs people in the role of an engineer (Kramer, 1999). The degree to which the organizational context affects the indirect development of trust depends on the strength of links in the social network and on the trustor's ability to establish links between the situations experienced by others and the situations he or she is confronting (Doney et al., 1998). These findings suggest that the organizational context and the indirect exposure that it facilitates can have an important influence on trust and reliance.

Beyond the individual and organizational context, culture is another element of context that can influence trust and its development (Baba, Falkenburg, & Hill, 1996). Culture can be defined as a set of social norms and expectations that reflect shared educational and life experiences associated with national differences or distinct cohorts of workers. In particular, Japanese citizens have been found to have a generally low level of trust (Yamagishi & Yamagishi, 1994). In Japan, networks of mutually committed relations play a more important role in governing exchange relationships than they do in the United States. A cross-national questionnaire that included 1,136 Japanese and 501 American respondents showed that the American respondents were more trusting of other people in general and considered reputation more important. In contrast, the Japanese respondents placed a higher value on exchange relationships that are based on personal relationships. One explanation for this is that the extreme social stability of mutually committed relationships in Japan reduces uncertainty about transactions and diminishes the role of trust (Doney et al., 1998).

More generally, cultural differences associated with power distance (e.g., dependence on authority and respect for authoritarian norms), uncertainty avoidance, and individualist and collectivist attitudes can influence the development and role of trust (Doney et al., 1998). Cultural variation can influence trust in an on-line environment. Karvonen (2001) compared trust in E-commerce service among consumers from Finland, Sweden, and Iceland. Although all trusted a simple design, there were substantial differences in the initial trust in a complex design, with Finnish customers being the most wary and Icelandic customers the most trusting. More generally, the level of social trust varies substantially among countries, and this variation accounts for over 64% of the variance in the level of Internet adoption (Huang, Keser, Leland, & Shachat, 2002). The culturally dependent nature of trust suggests that findings regarding trust in automation may need to be verified when they are extrapolated from one cultural setting to another.

Cultural context can also describe systematic differences in groups of workers. For example, in the context of trust in automation, Riley (1996) found that pilots who are accustomed to using automation trusted and relied on automation more than did students who were not as familiar with automation. Likewise, in field studies of operators adapting to computerized manufacturing systems, Zuboff (1988) found that the culture associated with those operators who had been exposed to computer systems led to greater trust and acceptance of automation. These results show that trust in automation, like trust in people, is culturally influenced in that it depends on the long-term experiences of a group of people.

Trust between people depends on the individual, organizational, and cultural context. This context influences trust because it affects initial levels of trust and how people interpret information regarding the agent. Some evidence shows that these relationships may also hold for trust in automation; however, very little research has investigated any of these factors. We address the types of information that form the basis of trust in the next section.

Basis of Trust: Levels of Detail and of Attributional Abstraction

Trust is a multidimensional construct that is based on trustee characteristics such as motives, intentions, and actions (Bhattacharya, Devinney, & Pillutla, 1998; Mayer et al., 1995). The basis for trust is the information that informs the person about the ability of the trustee to achieve the goals of the trustor. Two critical elements define the basis of trust. The first is the focus of trust: What is to be trusted? The second is the type of information that describes the entity to be trusted: What is the information supporting trust? This information guides expectations regarding how well the trustee can achieve the trustor's goals. The focus of trust can be described according to levels of detail, and the information supporting trust can be described according to three levels of attributional abstraction.

The focus of trust can be considered along a dimension defined by the level of detail, which varies from general trust of institutions and organizations to trust in a specific agent. With automation, this might correspond to trust in an overall system of automation as compared with trust in a particular mode of an automatic controller. Couch and Jones (1997) considered three levels of trust – generalized, network, and partner trust – which are similar to the three levels identified by McKnight et al. (1998). General trust corresponds to trust in human nature and institutions (Barber, 1983). Network trust refers to the cluster of acquaintances that constitutes a person's social network, whereas partner trust corresponds to trust in specific persons (Rempel et al., 1985).

There does not seem to be a reliable relationship between global and specific trust, suggesting that this dimension makes important distinctions regarding the evolution of trust (Couch & Jones, 1997). For example, trust in a supervisor and trust in an organization are distinct and depend on qualitatively different factors (Tan & Tan, 2000). Trust in a supervisor depends on perceived ability, integrity, and benevolence, whereas trust in the organization depends on more distal variables of perceived organizational support and justice. General trust is frequently considered to be a personality trait, whereas specific trust is frequently viewed as an outcome of a specific interaction (Couch & Jones, 1997). Several studies show, however, that general and specific trust are not necessarily tied to the "trait and state" distinctions and may instead simply refer to different levels of detail (Sitkin & Roth, 1993). Developing a high degree of functional specificity in the trust of automation may require specific information for each level of detail of the automation.

The basis of trust can be considered along a dimension of attributional abstraction, which varies from demonstrations of competence to the intentions of the agent. A recent review of the trust literature concluded that three general levels summarize the bases of trust: ability, integrity, and benevolence (Mayer et al., 1995). Ability is the group of skills, competencies, and characteristics that enable the trustee to influence the domain. Integrity is the degree to which the trustee adheres to a set of principles the trustor finds acceptable. Benevolence is the extent to which the intents and motivations of the trustee are aligned with those of the trustor.

In the context of interpersonal relationships, Rempel et al. (1985) viewed trust as an evolving phenomenon, with the basis of trust changing as the relationship progressed. They argued that predictability, the degree to which future behavior can be anticipated (and which is similar to ability), forms the basis of trust early in a relationship. This is followed by dependability, which is the degree to which behavior is consistent and is similar to integrity. As the relationship matures, the basis of trust ultimately shifts to faith, which is a more general judgment that a person can be relied upon and is similar to benevolence. A similar progression emerged in a study of operators' adaptation to new technology (Zuboff, 1988). Trust in that context depended on trial-and-error experience, followed by understanding of the technology's operation, and finally faith. Lee and Moray (1992) made similar distinctions in defining the factors that influence trust in automation. They identified performance, process, and purpose as the general bases of trust.

Performance refers to the current and historical operation of the automation and includes characteristics such as reliability, predictability, and ability. Performance information describes what the automation does. More specifically, performance refers to the competency or expertise as demonstrated by its ability to achieve the operator's goals. Because performance is linked to the ability to achieve specific goals, it demonstrates the task- and situation-dependent nature of trust. This is similar to Sheridan's (1992) concept of robustness as a basis for trust in human-automation relationships. The operator will tend to trust automation that performs in a manner that reliably achieves his or her goals.

Process is the degree to which the automation's algorithms are appropriate for the situation and able to achieve the operator's goals. Process information describes how the automation operates. In interpersonal relationships, this corresponds to the consistency of actions associated with adherence to a set of acceptable principles (Mayer et al., 1995). Process as a basis for trust reflects a shift away from a focus on specific behaviors and toward qualities and characteristics attributed to an agent. With process, trust is in the agent and not in the specific actions of the agent. Because of this, the process basis of trust relies on dispositional attributions and inferences drawn from the performance of the agent. In this sense, process is similar to dependability and integrity. Openness, the willingness to give and receive ideas, is another element of the process basis of trust. Interestingly, consistency and openness are more important for trust in peers than for trust in supervisors or subordinates (Schindler & Thomas, 1993).

In the context of automation, the process basis of trust refers to the algorithms and operations that govern the behavior of the automation. This concept is similar to Sheridan's (1992) concept of understandability in human-automation relationships. The operator will tend to trust the automation if its algorithms can be understood and seem capable of achieving the operator's goals in the current situation.

Purpose refers to the degree to which the automation is being used within the realm of the designer's intent. Purpose describes why the automation was developed. Purpose corresponds to faith and benevolence and reflects the perception that the trustee has a positive orientation toward the trustor. With interpersonal relationships, the perception of such a positive orientation depends on the intentions and motives of the trustee. This can take the form of abstract, generalized value congruence (Sitkin & Roth, 1993), which can be described as whether, and to what extent, the trustee has a motive to lie (Hovland, Janis, & Kelly, 1953). The purpose basis of trust reflects the attribution of these characteristics to the automation. Frequently, whether or not this attribution takes place will depend on whether the designer's intent has been communicated to the operator. If so, the operator will tend to trust the automation to achieve the goals it was designed to achieve.

Table 1 builds on the work of Mayer et al. (1995) and shows how the many characteristics of a trustee that affect trust can be summarized by the three bases of performance, process, and purpose.


TABLE 1: Summary of the Dimensions That Describe the Basis of Trust

Study                            Basis of Trust                       Summary Dimension
Barber (1983)                    Competence                           Performance
                                 Persistence                          Process
                                 Fiduciary responsibility             Purpose
Butler & Cantrell (1984)         Competence                           Performance
                                 Integrity                            Process
                                 Consistency                          Process
                                 Loyalty                              Purpose
                                 Openness                             Process
Cook & Wall (1980)               Ability                              Performance
                                 Intentions                           Purpose
Deutsch (1960)                   Ability                              Performance
                                 Intentions                           Purpose
Gabarro (1978)                   Integrity                            Process
                                 Motives                              Purpose
                                 Openness                             Process
                                 Discreetness                         Process
                                 Functional/specific competence       Performance
                                 Interpersonal competence             Performance
                                 Business sense                       Performance
                                 Judgment                             Performance
Hovland, Janis, & Kelly (1953)   Expertise                            Performance
                                 Motivation to lie                    Purpose
Jennings (1967)                  Loyalty                              Purpose
                                 Predictability                       Process
                                 Accessibility                        Process
                                 Availability                         Process
Kee & Knox (1970)                Competence                           Performance
                                 Motives                              Purpose
Mayer et al. (1995)              Ability                              Performance
                                 Integrity                            Process
                                 Benevolence                          Purpose
Mishra (1996)                    Competency                           Performance
                                 Reliability                          Performance
                                 Openness                             Process
                                 Concern                              Purpose
Moorman et al. (1993)            Integrity                            Process
                                 Willingness to reduce uncertainty    Process
                                 Confidentiality                      Process
                                 Expertise                            Performance
                                 Tactfulness                          Process
                                 Sincerity                            Process
                                 Congeniality                         Performance
                                 Timeliness                           Performance
Rempel et al. (1985)             Reliability                          Performance
                                 Dependability                        Process
                                 Faith                                Purpose
Sitkin & Roth (1993)             Context-specific reliability         Performance
                                 Generalized value congruence         Purpose
Zuboff (1988)                    Trial-and-error experience           Performance
                                 Understanding                        Process
                                 Leap of faith                        Purpose


As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.

Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms associated with the dimension of process; analysis of internal mechanisms can support inferences regarding the designer's intent associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by the manager (purpose basis for trust) and the manager's actions (performance basis of trust) has a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty, in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation, in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument, in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).
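The expected value account lends itself to a simple illustration. The following sketch is our own construction rather than a model from the literature reviewed here; the learning rate and the payoff coding are hypothetical choices made only for demonstration.

```python
# Illustrative sketch: trust as a running expected-value estimate that is
# nudged toward the payoff of each new interaction. With accumulating
# experience the estimate converges on the observed rate of favorable
# outcomes, mirroring the accumulation account described above.

def update_trust(trust: float, payoff: float, alpha: float = 0.1) -> float:
    """Move the trust estimate toward the payoff of the latest interaction."""
    return trust + alpha * (payoff - trust)

trust = 0.5  # neutral initial expectancy
for payoff in [1.0, 1.0, 0.0, 1.0, 1.0]:  # 1.0 = favorable outcome, 0.0 = betrayal
    trust = update_trust(trust, payoff)
    print(f"trust = {trust:.3f}")
```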

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values. The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings or procedures guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oatley, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oatley). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoner's dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation, and these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions. It also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner; however, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional, nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous, computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejtersen, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation, in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them. People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop differently from the way it develops in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct for understanding reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.
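One way to make this two-stage account concrete is the following sketch, which is our illustration rather than part of the model in Figure 4; the attribute names, thresholds, and decision rules are all hypothetical simplifications.

```python
# Illustrative sketch of the two stages described above: attitudes combine
# to form an intention to rely, and situational constraints then determine
# whether that intention becomes actual reliance. All rules are hypothetical.

from dataclasses import dataclass

@dataclass
class Attitudes:
    trust: float             # trust in the automation, 0..1
    self_confidence: float   # confidence in manual control, 0..1
    effort_to_engage: float  # perceived cost of configuring/engaging, 0..1

def intention_to_rely(a: Attitudes) -> bool:
    # Rely when trust, discounted by the effort of engaging the automation,
    # outweighs confidence in manual control.
    return a.trust - a.effort_to_engage > a.self_confidence

def actual_reliance(intends: bool, time_available: float, time_to_engage: float) -> bool:
    # An intention becomes reliance only if there is enough time to engage.
    return intends and time_available >= time_to_engage

a = Attitudes(trust=0.8, self_confidence=0.5, effort_to_engage=0.1)
print(actual_reliance(intention_to_rely(a), time_available=5.0, time_to_engage=2.0))
```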

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of E-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function. The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
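A minimal numerical sketch of such a first-order model appears below. The time constant, fault timing, and capability values are hypothetical, chosen only to show the exponential decline and recovery; Lee and Moray (1994) estimated the corresponding parameters from operator data.

```python
# Illustrative first-order lag: trust moves toward the automation's current
# capability with time constant tau, so a fault produces an immediate
# exponential decline and a gradual recovery after repair.

import numpy as np

dt, tau = 1.0, 8.0                 # time step and time constant (arbitrary units)
t = np.arange(0, 60, dt)
capability = np.where((t >= 10) & (t < 20), 0.3, 1.0)  # transient (acute) fault

trust = np.empty_like(t)
trust[0] = 1.0
for k in range(1, len(t)):
    # Euler integration of dT/dt = (capability - T) / tau
    trust[k] = trust[k - 1] + dt * (capability[k - 1] - trust[k - 1]) / tau

print(trust.round(2))  # largest drop immediately after the fault, then slow recovery
```

The time constant tau plays the role of temporal specificity: a small tau tracks changes in capability closely, whereas a large tau acts as the time delay described above.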

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
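Hysteresis, for example, can be captured with two reliance thresholds, so that the trust level required to engage the automation is higher than the level at which it is abandoned. The sketch below is our illustration, not an analysis from the cited work; the threshold values are hypothetical.

```python
# Illustrative hysteresis in reliance: whether the operator relies on the
# automation depends on the direction from which trust approaches the
# thresholds, not just on its current value.

ENGAGE_THRESHOLD = 0.7     # trust must rise above this to begin relying
DISENGAGE_THRESHOLD = 0.4  # trust must fall below this to stop relying

def update_reliance(relying: bool, trust: float) -> bool:
    if not relying and trust > ENGAGE_THRESHOLD:
        return True
    if relying and trust < DISENGAGE_THRESHOLD:
        return False
    return relying  # between the thresholds, the current state persists

relying = False
for trust in [0.5, 0.75, 0.6, 0.45, 0.35, 0.5, 0.65, 0.75]:
    relying = update_reliance(relying, trust)
    print(f"trust={trust:.2f} -> relying={relying}")
```

Note that at trust = 0.5 the operator may or may not be relying on the automation, depending on the history of the interaction: a signature of hysteresis.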

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that the final authority for decisions and actions that occur in time-critical situations should be allocated to automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors caused trust and reliance to decline more for those in a familiar city, whose self-confidence was high, than for those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).
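The probability-matching argument can be made concrete with a simple expected-cost sketch: relying on imperfect automation is rational whenever the expected cost of its occasional failures is lower than the cost of continuously monitoring the raw data. All payoff values below are illustrative assumptions:

```python
def expected_cost(reliance_rate: float, reliability: float,
                  monitoring_cost: float, miss_cost: float) -> float:
    """Expected cost per trial for a given rate of reliance.

    Relying saves monitoring effort but incurs miss_cost whenever the
    automation fails (probability 1 - reliability); monitoring incurs
    a fixed attentional cost. All values are illustrative assumptions.
    """
    cost_if_rely = (1.0 - reliability) * miss_cost
    return reliance_rate * cost_if_rely + (1.0 - reliance_rate) * monitoring_cost

# With 90% reliable automation, full reliance (expected cost 0.2)
# beats full manual monitoring (cost 0.5) even though one trial in
# ten produces an incorrect outcome.
print(expected_cost(1.0, reliability=0.9, monitoring_cost=0.5, miss_cost=2.0))
print(expected_cost(0.0, reliability=0.9, monitoring_cost=0.5, miss_cost=2.0))
```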

One approach to reducing overtrust in the automation and increasing the detection of failures is adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.
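A minimal sketch of the allocation schedule used in that study follows; the placement of the manual block is an assumption for illustration, and a fielded adaptive system would instead trigger reallocation from measures of operator state and situation demands rather than from a fixed clock:

```python
def allocation_schedule(total_minutes: int,
                        manual_block: tuple = (30, 40)) -> list:
    """Per-minute task allocation for the monitoring task.

    Mirrors the design sketched above: monitoring is automated except
    for one block returned to manual control (minutes 30-39 here, an
    assumed placement; the study used a 10-min manual period). A
    deployed adaptive system would condition the switch on operator
    and situation measures instead of a fixed schedule.
    """
    start, end = manual_block
    return ["manual" if start <= minute < end else "automated"
            for minute in range(total_minutes)]

schedule = allocation_schedule(60)
print(schedule.count("manual"))  # 10 minutes of manual control
```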

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and product dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the process dimension was most influenced by the presence or absence of a fault, as compared with the performance or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.
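These dynamics can be illustrated with a toy discrete-time update in which faults move trust more than successes and automation that falls below a use threshold stops generating new evidence. The gains and the threshold are assumptions chosen for illustration, not fitted parameters:

```python
def simulate_trust(outcomes, trust=0.5, gain_pos=0.05, gain_neg=0.20,
                   use_threshold=0.4):
    """Toy trajectory of trust over a sequence of automation outcomes.

    outcomes: sequence of True (success) or False (fault). Faults move
    trust more than successes (gain_neg > gain_pos), and once trust
    falls below use_threshold the automation goes unused, so no new
    evidence arrives and low trust becomes self-sustaining. All
    parameter values are illustrative assumptions.
    """
    history = [trust]
    for ok in outcomes:
        if trust >= use_threshold:  # operator engages the automation
            target, gain = (1.0, gain_pos) if ok else (0.0, gain_neg)
            trust += gain * (target - trust)
        # else: automation unused, so trust stays frozen at a low level
        history.append(trust)
    return history

# Early faults followed by perfect performance: trust drops below the
# use threshold and never recovers, mirroring the lasting effect of
# initial experience described above.
print(simulate_trust([False, False, False] + [True] * 20)[-1])  # ~0.32
```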

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al., 2000; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, the appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of automation-related information is usually mediated by a display. This suggests that the appropriateness of trust (that is, the match between the trust and the capabilities of the automation) depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an on-line survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form, in particular the emphasis on concrete, realistic representations, can increase the level of trust.
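The reliability-weighting result can be sketched with a standard signal detection construction in which each display element contributes evidence in proportion to the log odds of its reliability. This generic ideal-observer form is an assumption for illustration, not the specific model of Montgomery and Sorkin (1996):

```python
import math

def weighted_decision(observations, reliabilities, threshold=0.0):
    """Combine evidence from display elements weighted by reliability.

    Each observation is +1 (element suggests "signal") or -1 (suggests
    "noise"). An element with reliability p contributes the log odds
    log(p / (1 - p)), so near-chance elements (p close to 0.5) carry
    almost no weight. The threshold of 0 assumes equal priors and
    symmetric costs.
    """
    evidence = sum(obs * math.log(p / (1.0 - p))
                   for obs, p in zip(observations, reliabilities))
    return "signal" if evidence > threshold else "noise"

# One highly reliable element outweighs two marginal ones, much as
# participants weighted bright (reliable) elements more heavily.
print(weighted_decision([+1, -1, -1], [0.95, 0.55, 0.55]))  # signal
```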

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk, as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent (a sketch of these measures follows this list).
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
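A minimal sketch of how calibration and resolution might be quantified for a set of trust ratings follows; the binning scheme and the variance-based resolution measure are simplifying assumptions, loosely patterned on decompositions used for probability judgments (cf. Lichtenstein, Fischhoff, & Phillips, 1982):

```python
from collections import defaultdict
from statistics import mean, pvariance

def calibration_error(trust, capability):
    """Mean absolute gap between rated trust and true capability,
    both on a 0-1 scale; lower values indicate better calibration."""
    return mean(abs(t - c) for t, c in zip(trust, capability))

def resolution(trust, capability, bins=4):
    """Spread of mean trust across capability levels.

    Trust that tracks changes in capability yields high variance
    across capability bins; flat trust yields variance near zero.
    The rounding-based binning is an assumed simplification.
    """
    by_bin = defaultdict(list)
    for t, c in zip(trust, capability):
        by_bin[round(c * (bins - 1))].append(t)
    return pvariance([mean(v) for v in by_bin.values()])

capability      = [0.2, 0.4, 0.6, 0.9]    # true capability per context
well_calibrated = [0.25, 0.35, 0.65, 0.85]
overtrusting    = [0.8, 0.8, 0.9, 0.9]
print(calibration_error(well_calibrated, capability))  # 0.05
print(calibration_error(overtrusting, capability))     # 0.325
print(resolution(overtrusting, capability))            # ~0.002: flat trust
```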

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure in the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization: as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect: An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology: Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information systems on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy: A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces: Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology: Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.

MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology: Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication: Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology: Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report: Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm: Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A: Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it: Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.

Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003

Page 8: Trust in Automation: Designing for Appropriate Reliancecsl/publications/pdf/leesee04.pdfcombination of many modes, and reliance is of-tena more graded process. Automation reliance

TRUST IN AUTOMATION 57

feedback regarding collaboratorsrsquo intentions andto situational risk (Kramer 1999)

These findings may explain why individualdifferences in the general tendency to trust auto-mation as measured by a complacency scale(Parasuraman Singh Molloy amp Parasuraman1992) are not clearly related to misuse of auto-mation For example Singh Molloy and Para-suraman (1993) found that high-complacencyindividuals actually detected more automationfailures in a constant reliability condition (534compared with 187 for low-complacencyindividuals) This unexpected result is similarto findings in studies of human trust in whichhighly trusting individuals were found to trustmore appropriately Several studies of trust inautomation show that for some people trustchanges substantially as the capability of theautomation changes and for other people trustchanges relatively little (Lee amp Moray1994 Ma-salonis 2000) One possible explanation thatmerits further investigation is that high-trustindividuals may be better able to adjust theirtrust to situations in which the automation ishighly capable as well as to situations in whichit is not

In contrast to the description of trust as a stable personality trait, most researchers have focused on trust as an attitude (Jones & George, 1998). In describing interpersonal relationships, trust has been considered as a dynamic attitude that evolves along with the developing relationship (Rempel et al., 1985). The influence of individual differences regarding the predisposition to trust is most important when a situation is ambiguous and generalized expectancies dominate, and it becomes less important as the relationship progresses (McKnight, Cummings, & Chervany, 1998). Trust as an attitude is a history-dependent variable that depends on the prior behavior of the trusted person and the information that is shared (Deutsch, 1958). The initial level of trust is determined by past experiences in similar situations; some of these experiences may even be indirect, as in the case of gossip.

The organizational context influences how reputation, gossip, and formal and informal roles affect the trust of people who have never had any direct contact with the trustee (Burt & Knez, 1996; Kramer, 1999). For example, engineers are trusted not because of the ability of any specific person but because of the underlying education and regulatory structure that governs people in the role of an engineer (Kramer, 1999). The degree to which the organizational context affects the indirect development of trust depends on the strength of links in the social network and on the trustor's ability to establish links between the situations experienced by others and the situations he or she is confronting (Doney et al., 1998). These findings suggest that the organizational context and the indirect exposure that it facilitates can have an important influence on trust and reliance.

Beyond the individual and organizational context, culture is another element of context that can influence trust and its development (Baba, Falkenburg, & Hill, 1996). Culture can be defined as a set of social norms and expectations that reflect shared educational and life experiences associated with national differences or distinct cohorts of workers. In particular, Japanese citizens have been found to have a generally low level of trust (Yamagishi & Yamagishi, 1994). In Japan, networks of mutually committed relations play a more important role in governing exchange relationships than they do in the United States. A cross-national questionnaire that included 1,136 Japanese and 501 American respondents showed that the American respondents were more trusting of other people in general and considered reputation more important. In contrast, the Japanese respondents placed a higher value on exchange relationships that are based on personal relationships. One explanation for this is that the extreme social stability of mutually committed relationships in Japan reduces uncertainty about transactions and diminishes the role of trust (Doney et al., 1998).

More generally, cultural differences associated with power distance (e.g., dependence on authority and respect for authoritarian norms), uncertainty avoidance, and individualist and collectivist attitudes can influence the development and role of trust (Doney et al., 1998). Cultural variation can influence trust in an on-line environment. Karvonen (2001) compared trust in E-commerce service among consumers from Finland, Sweden, and Iceland. Although all trusted a simple design, there were substantial differences in the initial trust in a complex design, with Finnish customers being the most wary and Icelandic customers the most trusting. More generally, the level of social trust varies substantially among countries, and this variation accounts for over 64% of the variance in the level of Internet adoption (Huang, Keser, Leland, & Shachat, 2002). The culturally dependent nature of trust suggests that findings regarding trust in automation may need to be verified when they are extrapolated from one cultural setting to another.

Cultural context can also describe systematic differences in groups of workers. For example, in the context of trust in automation, Riley (1996) found that pilots who are accustomed to using automation trusted and relied on automation more than did students who were not as familiar with automation. Likewise, in field studies of operators adapting to computerized manufacturing systems, Zuboff (1988) found that the culture associated with those operators who had been exposed to computer systems led to greater trust and acceptance of automation. These results show that trust in automation, like trust in people, is culturally influenced in that it depends on the long-term experiences of a group of people.

Trust between people depends on the individual, organizational, and cultural context. This context influences trust because it affects initial levels of trust and how people interpret information regarding the agent. Some evidence shows that these relationships may also hold for trust in automation; however, very little research has investigated any of these factors. We address the types of information that form the basis of trust in the next section.

Basis of Trust: Levels of Detail and of Attributional Abstraction

Trust is a multidimensional construct that is based on trustee characteristics such as motives, intentions, and actions (Bhattacharya, Devinney, & Pillutla, 1998; Mayer et al., 1995). The basis for trust is the information that informs the person about the ability of the trustee to achieve the goals of the trustor. Two critical elements define the basis of trust. The first is the focus of trust: What is to be trusted? The second is the type of information that describes the entity to be trusted: What is the information supporting trust? This information guides expectations regarding how well the trustee can achieve the trustor's goals. The focus of trust can be described according to levels of detail, and the information supporting trust can be described according to three levels of attributional abstraction.

The focus of trust can be considered along a dimension defined by the level of detail, which varies from general trust of institutions and organizations to trust in a specific agent. With automation, this might correspond to trust in an overall system of automation as compared with trust in a particular mode of an automatic controller. Couch and Jones (1997) considered three levels of trust – generalized, network, and partner trust – which are similar to the three levels identified by McKnight et al. (1998). General trust corresponds to trust in human nature and institutions (Barber, 1983). Network trust refers to the cluster of acquaintances that constitutes a person's social network, whereas partner trust corresponds to trust in specific persons (Rempel et al., 1985).

There does not seem to be a reliable relationship between global and specific trust, suggesting that this dimension makes important distinctions regarding the evolution of trust (Couch & Jones, 1997). For example, trust in a supervisor and trust in an organization are distinct and depend on qualitatively different factors (Tan & Tan, 2000). Trust in a supervisor depends on perceived ability, integrity, and benevolence, whereas trust in the organization depends on more distal variables of perceived organizational support and justice. General trust is frequently considered to be a personality trait, whereas specific trust is frequently viewed as an outcome of a specific interaction (Couch & Jones, 1997). Several studies show, however, that general and specific trust are not necessarily tied to the "trait and state" distinctions and may instead simply refer to different levels of detail (Sitkin & Roth, 1993). Developing a high degree of functional specificity in the trust of automation may require specific information for each level of detail of the automation.

The basis of trust can be considered along a dimension of attributional abstraction, which varies from demonstrations of competence to the intentions of the agent. A recent review of the trust literature concluded that three general levels summarize the bases of trust: ability, integrity, and benevolence (Mayer et al., 1995). Ability is the group of skills, competencies, and characteristics that enable the trustee to influence the domain. Integrity is the degree to which the trustee adheres to a set of principles the trustor finds acceptable. Benevolence is the extent to which the intents and motivations of the trustee are aligned with those of the trustor.

In the context of interpersonal relationships, Rempel et al. (1985) viewed trust as an evolving phenomenon, with the basis of trust changing as the relationship progressed. They argued that predictability, the degree to which future behavior can be anticipated (and which is similar to ability), forms the basis of trust early in a relationship. This is followed by dependability, which is the degree to which behavior is consistent and is similar to integrity. As the relationship matures, the basis of trust ultimately shifts to faith, which is a more general judgment that a person can be relied upon and is similar to benevolence. A similar progression emerged in a study of operators' adaptation to new technology (Zuboff, 1988). Trust in that context depended on trial-and-error experience, followed by understanding of the technology's operation, and finally faith. Lee and Moray (1992) made similar distinctions in defining the factors that influence trust in automation. They identified performance, process, and purpose as the general bases of trust.

Performance refers to the current and historical operation of the automation and includes characteristics such as reliability, predictability, and ability. Performance information describes what the automation does. More specifically, performance refers to the competency or expertise as demonstrated by its ability to achieve the operator's goals. Because performance is linked to the ability to achieve specific goals, it demonstrates the task- and situation-dependent nature of trust. This is similar to Sheridan's (1992) concept of robustness as a basis for trust in human-automation relationships. The operator will tend to trust automation that performs in a manner that reliably achieves his or her goals.

Process is the degree to which the automation's algorithms are appropriate for the situation and able to achieve the operator's goals. Process information describes how the automation operates. In interpersonal relationships, this corresponds to the consistency of actions associated with adherence to a set of acceptable principles (Mayer et al., 1995). Process as a basis for trust reflects a shift away from focus on specific behaviors and toward qualities and characteristics attributed to an agent. With process, trust is in the agent and not in the specific actions of the agent. Because of this, the process basis of trust relies on dispositional attributions and inferences drawn from the performance of the agent. In this sense, process is similar to dependability and integrity. Openness, the willingness to give and receive ideas, is another element of the process basis of trust. Interestingly, consistency and openness are more important for trust in peers than for trust in supervisors or subordinates (Schindler & Thomas, 1993).

In the context of automation, the process basis of trust refers to the algorithms and operations that govern the behavior of the automation. This concept is similar to Sheridan's (1992) concept of understandability in human-automation relationships. The operator will tend to trust the automation if its algorithms can be understood and seem capable of achieving the operator's goals in the current situation.

Purpose refers to the degree to which the automation is being used within the realm of the designer's intent. Purpose describes why the automation was developed. Purpose corresponds to faith and benevolence and reflects the perception that the trustee has a positive orientation toward the trustor. With interpersonal relationships, the perception of such a positive orientation depends on the intentions and motives of the trustee. This can take the form of abstract, generalized value congruence (Sitkin & Roth, 1993), which can be described as whether, and to what extent, the trustee has a motive to lie (Hovland, Janis, & Kelly, 1953). The purpose basis of trust reflects the attribution of these characteristics to the automation. Frequently, whether or not this attribution takes place will depend on whether the designer's intent has been communicated to the operator. If so, the operator will tend to trust automation to achieve the goals it was designed to achieve.


TABLE 1: Summary of the Dimensions That Describe the Basis of Trust

Study – Basis of Trust (Summary Dimension)

Barber (1983) – Competence (Performance); Persistence (Process); Fiduciary responsibility (Purpose)

Butler & Cantrell (1984) – Competence (Performance); Integrity (Process); Consistency (Process); Loyalty (Purpose); Openness (Process)

Cook & Wall (1980) – Ability (Performance); Intentions (Purpose)

Deutsch (1960) – Ability (Performance); Intentions (Purpose)

Gabarro (1978) – Integrity (Process); Motives (Purpose); Openness (Process); Discreetness (Process); Functional/specific competence (Performance); Interpersonal competence (Performance); Business sense (Performance); Judgment (Performance)

Hovland, Janis, & Kelly (1953) – Expertise (Performance); Motivation to lie (Purpose)

Jennings (1967) – Loyalty (Purpose); Predictability (Process); Accessibility (Process); Availability (Process)

Kee & Knox (1970) – Competence (Performance); Motives (Purpose)

Mayer et al. (1995) – Ability (Performance); Integrity (Process); Benevolence (Purpose)

Mishra (1996) – Competency (Performance); Reliability (Performance); Openness (Process); Concern (Purpose)

Moorman et al. (1993) – Integrity (Process); Willingness to reduce uncertainty (Process); Confidentiality (Process); Expertise (Performance); Tactfulness (Process); Sincerity (Process); Congeniality (Performance); Timeliness (Performance)

Rempel et al. (1985) – Reliability (Performance); Dependability (Process); Faith (Purpose)

Sitkin & Roth (1993) – Context-specific reliability (Performance); Generalized value congruence (Purpose)

Zuboff (1988) – Trial-and-error experience (Performance); Understanding (Process); Leap of faith (Purpose)


Table 1 builds on the work of Mayer et al. (1995) and shows how the many characteristics of a trustee that affect trust can be summarized by the three bases of performance, process, and purpose. As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.

Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms, associated with the dimension of process; analysis of internal mechanisms can support inferences regarding the designer's intent, associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by the manager (purpose basis for trust) and the manager's actions (performance basis of trust) has a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty, in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation, in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument, in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance, as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values.

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.


The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust, because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings or procedures guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oatley, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oatley). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation, and these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions; it also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner. However, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust, in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation, in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them. People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop differently from the way it develops in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct to understand reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.
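
To make this two-stage distinction concrete – attitudes combining into an intention to rely, and situational constraints then gating actual reliance – the following minimal sketch illustrates the structure. The factor weights, threshold, and gating conditions are illustrative assumptions, not parameters from the model in Figure 4:

    # Illustrative two-stage sketch: attitudes combine into an intention to
    # rely, and situational constraints gate whether the intention becomes
    # actual reliance. All weights and thresholds are hypothetical.

    def intention_to_rely(trust, self_confidence, workload, perceived_risk):
        # Trust pushes toward reliance; self-confidence in manual control,
        # effort to engage (folded here into workload), and perceived risk
        # push away from it.
        return (0.6 * trust - 0.3 * self_confidence
                - 0.05 * workload - 0.05 * perceived_risk)

    def relies(intention, time_available, configured_correctly, threshold=0.1):
        # Even a strong intention cannot produce reliance if there is no
        # time to engage the automation or a configuration error blocks it.
        return intention > threshold and time_available and configured_correctly

    i = intention_to_rely(trust=0.9, self_confidence=0.2,
                          workload=0.5, perceived_risk=0.3)
    print(relies(i, time_available=True, configured_correctly=True))   # True
    print(relies(i, time_available=False, configured_correctly=True))  # False: no time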

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process, in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.


Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of E-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely it is that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function. The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
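
As an illustration of such first-order dynamics (a sketch consistent with the description above, not the specific equation fitted by Lee and Moray), trust T(t) can be written as lagging the capability A(t) of the automation with time constant \tau:

    \tau \frac{dT(t)}{dt} = A(t) - T(t), \qquad T(t) = A + (T_0 - A)\,e^{-t/\tau}

where the second expression gives the response to a step change in capability to the level A from an initial trust level T_0. A fault (a step drop in A) thus produces the pattern described: the largest change in trust occurs immediately, the residual adjustment is distributed exponentially over time, and \tau sets the temporal specificity of trust.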

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
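
A minimal simulation sketch can make the hysteresis idea concrete. Assuming (a) trust is a first-order lag toward observed capability, (b) the operator relies whenever trust exceeds a fixed threshold, and (c) capability is observed strongly only while relying – all illustrative assumptions, with parameter values not drawn from the studies cited here – reliance switches on and off at different capability levels depending on the direction of change:

    # Sweep automation capability up and then down and record where the
    # reliance decision switches. Trust updates as a first-order lag, with
    # a much weaker update when not relying (little chance to observe).

    def trust_reliance_sweep(capability_profile, trust0=0.3, threshold=0.5,
                             gain_relying=0.2, gain_manual=0.02):
        trust = trust0
        relying = trust >= threshold
        switches = []
        for cap in capability_profile:
            gain = gain_relying if relying else gain_manual
            trust += gain * (cap - trust)   # first-order update toward capability
            now_relying = trust >= threshold
            if now_relying != relying:
                switches.append((round(cap, 2), now_relying))
            relying = now_relying
        return switches

    ramp_up = [i / 200 for i in range(201)]   # capability 0.0 -> 1.0
    ramp_down = list(reversed(ramp_up))       # then 1.0 -> 0.0
    print(trust_reliance_sweep(ramp_up + ramp_down))
    # Reliance switches ON only at a high capability on the up-sweep but
    # switches OFF near the threshold on the down-sweep: a delayed,
    # direction-dependent response of the kind labeled hysteresis above.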

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that automation should have the final authority for decisions and actions that must be allocated to automation in time-critical situations (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: Low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors cause trust and reliance to decline more for those in a familiar city, whose self-confidence was high, as compared with those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and thus greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring against the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though doing so will lead to some incorrect outcomes (Wickens et al., 2000).
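
The cost-balancing logic behind eutactic monitoring can be made concrete with a small calculation. The sketch below is a simplified stand-in for the optimal-monitoring analyses cited above: it assumes a fixed attentional cost per sample and a failure cost proportional to how long a fault can go undetected, then searches for the sampling interval that minimizes expected cost. All names and numbers are hypothetical.

```python
def expected_cost(sampling_interval, failure_rate, sample_cost, miss_cost):
    """Expected cost per unit time of periodically sampling an automated system.

    Sampling often costs attention; sampling rarely means a failure goes
    undetected for, on average, half of one interval. This cost structure is
    an illustrative assumption, not a model from the cited studies.
    """
    monitoring = sample_cost / sampling_interval
    undetected = failure_rate * miss_cost * sampling_interval / 2.0
    return monitoring + undetected

# Grid search for the interval that balances the two costs.
intervals = [i / 10 for i in range(1, 101)]
best = min(intervals, key=lambda t: expected_cost(
    t, failure_rate=0.02, sample_cost=1.0, miss_cost=50.0))
print(best)  # ~1.4: checking constantly and never checking are both suboptimal
```

An operator who samples near this optimum will still miss some failures, which is the sense in which imperfect monitoring of imperfect automation can nonetheless be eutactic.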

One approach to reducing overtrust in the automation and increasing the detection of failures is adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and a fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault: A fault in the automation did not cause trust to decline in other, similar, but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in the trust ratings associated with the performance dimension than in the ratings associated with process or purpose. Likewise, the rating corresponding to the process dimension was more influenced by the presence or absence of a fault than were the ratings associated with performance or purpose (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: Negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance even when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude: A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al., 2000; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.
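
A toy simulation helps to summarize these dynamics. The sketch below assumes an asymmetric update rule, in which faults reduce trust more strongly than successes restore it, together with a threshold reliance rule under which unused automation generates no new evidence. The gains and the threshold are illustrative assumptions, not estimates from the cited studies.

```python
def update_trust(trust, outcome_good, gain_up=0.05, gain_down=0.25):
    """One-step trust update with asymmetric gains.

    Negative experiences are weighted more heavily than positive ones,
    consistent with trust being conditioned by the worst behaviors of the
    system; the specific gain values are illustrative assumptions.
    """
    target = 1.0 if outcome_good else 0.0
    gain = gain_up if outcome_good else gain_down
    return trust + gain * (target - trust)

trust = 0.7
history = [False, False] + [True] * 20  # two early faults, then reliable service
for outcome_good in history:
    if trust > 0.5:  # reliance rule: unused automation yields no new evidence
        trust = update_trust(trust, outcome_good)
print(round(trust, 2))  # ~0.39: trust never recovers despite 20 good outcomes
```

With these settings, the two early faults push trust below the use threshold, and the subsequent run of good performance goes unobserved, reproducing the feedback loop in which a distrusted system is not used and so remains distrusted.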

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, the appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.
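
One way to operationalize these two dimensions is as an index for display content. The sketch below assumes a simple data structure in which each focus of trust (a level of detail) carries a description at each level of attributional abstraction. The class, the example systems, and the entries are hypothetical illustrations, not a scheme from the literature reviewed here.

```python
from dataclasses import dataclass

@dataclass
class CapabilityDescription:
    """Capability information for one focus of trust (one level of detail).

    The three fields correspond to the levels of attributional abstraction
    discussed above; the wording of each entry is hypothetical.
    """
    performance: str  # what the automation has done (e.g., recent error rates)
    process: str      # how it operates (algorithms, intermediate results)
    purpose: str      # why it exists (design basis, intended range of use)

# Levels of detail: a specific mode versus the controller as a whole.
display_content = {
    "route-following mode": CapabilityDescription(
        performance="cross-track error over the last ten legs",
        process="waypoint-capture logic and current guidance state",
        purpose="designed for open-water transit, not harbor approach"),
    "autopilot controller": CapabilityDescription(
        performance="faults per 1,000 hours of operation",
        process="control-law family and sensor inputs",
        purpose="reduce workload during routine course keeping"),
}
```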

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an online survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on “real-world feel,” which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced the trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust: These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease it, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more heavily than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, an emphasis on concrete, realistic representations – can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent: A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).

These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions, because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: Specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk than when the same data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance, and directions for future research, that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy and trustable automation. Trustworthy automation is automation that performs efficiently and reliably; achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity; specificity and resolution are particularly critical when the performance of the automation is highly context dependent (a worked sketch of calibration and resolution follows this list).
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
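
Calibration and resolution can be computed directly if each trust rating is treated as a forecast that the automation will succeed. The sketch below borrows the Murphy decomposition of the Brier score from the probability-judgment literature (cf. Lichtenstein, Fischhoff, & Phillips, 1982); applying that decomposition to trust ratings in this way is our assumption, and the example data are invented for illustration.

```python
from collections import defaultdict

def calibration_and_resolution(trust_ratings, automation_succeeded, n_bins=5):
    """Murphy decomposition applied to trust ratings in [0, 1].

    Calibration (smaller is better) measures whether ratings match observed
    success rates; resolution (larger is better) measures whether ratings
    discriminate situations where the automation succeeds from those where
    it fails.
    """
    bins = defaultdict(list)
    for rating, success in zip(trust_ratings, automation_succeeded):
        bins[min(int(rating * n_bins), n_bins - 1)].append((rating, success))
    n = len(trust_ratings)
    base_rate = sum(automation_succeeded) / n
    calibration = resolution = 0.0
    for members in bins.values():
        mean_rating = sum(r for r, _ in members) / len(members)
        success_rate = sum(s for _, s in members) / len(members)
        weight = len(members) / n
        calibration += weight * (mean_rating - success_rate) ** 2
        resolution += weight * (success_rate - base_rate) ** 2
    return calibration, resolution

# Invented data: high trust when the automation succeeded, low when it failed.
ratings = [0.9, 0.8, 0.85, 0.3, 0.2, 0.25]
outcomes = [1, 1, 1, 0, 0, 1]
print(calibration_and_resolution(ratings, outcomes))  # ~(0.015, 0.111)
```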

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences the capability of the automation. The concept of ecological interface design has led to a substantial research effort focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links between the abstraction hierarchy and the characteristics of the automation as described by the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research on human-automation interaction and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure in the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people: Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

“Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object” (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: “human thinking begins in an intimate association with emotions and feelings which is never entirely lost” (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence the design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an “aid” can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.

MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced “complacency.” International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.

Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic “remedies” for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering, with a focus on trust in automation, from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003

Page 9: Trust in Automation: Designing for Appropriate Reliancecsl/publications/pdf/leesee04.pdfcombination of many modes, and reliance is of-tena more graded process. Automation reliance

58 Spring 2004 ndash Human Factors

differences in the initial trust in a complexdesign with Finnish customers being the mostwary and Icelandic customers the most trustingMore generally the level of social trust variessubstantially among countries and this variationaccounts for over 64 of the variance in the lev-el of Internet adoption (Huang Keser Lelandamp Shachat 2002) The culturally dependentnature of trust suggests that findings regardingtrust in automation may need to be verifiedwhen they are extrapolated from one culturalsetting to another

Cultural context can also describe systematicdifferences in groups of workers For exam-ple in the context of trust in automation Riley(1996) found that pilots who are accustomedto using automation trusted and relied on auto-mation more than did students who were notas familiar with automation Likewise in fieldstudies of operators adapting to computerizedmanufacturing systems Zuboff (1988) foundthat the culture associated with those operatorswho had been exposed to computer systems ledto greater trust and acceptance of automationThese results show that trust in automation liketrust in people is culturally influenced in thatit depends on the long-term experiences of agroup of people

Trust between people depends on the individual, organizational, and cultural context. This context influences trust because it affects initial levels of trust and how people interpret information regarding the agent. Some evidence shows that these relationships may also hold for trust in automation; however, very little research has investigated any of these factors. We address the types of information that form the basis of trust in the next section.

Basis of Trust: Levels of Detail and of Attributional Abstraction

Trust is a multidimensional construct that is based on trustee characteristics such as motives, intentions, and actions (Bhattacharya, Devinney, & Pillutla, 1998; Mayer et al., 1995). The basis for trust is the information that informs the person about the ability of the trustee to achieve the goals of the trustor. Two critical elements define the basis of trust. The first is the focus of trust: What is to be trusted? The second is the type of information that describes the entity to be trusted: What is the information supporting trust? This information guides expectations regarding how well the trustee can achieve the trustor's goals. The focus of trust can be described according to levels of detail, and the information supporting trust can be described according to three levels of attributional abstraction.

The focus of trust can be considered along a dimension defined by the level of detail, which varies from general trust of institutions and organizations to trust in a specific agent. With automation, this might correspond to trust in an overall system of automation as compared with trust in a particular mode of an automatic controller. Couch and Jones (1997) considered three levels of trust – generalized, network, and partner trust – which are similar to the three levels identified by McKnight et al. (1998). General trust corresponds to trust in human nature and institutions (Barber, 1983). Network trust refers to the cluster of acquaintances that constitutes a person's social network, whereas partner trust corresponds to trust in specific persons (Rempel et al., 1985).

There does not seem to be a reliable relationship between global and specific trust, suggesting that this dimension makes important distinctions regarding the evolution of trust (Couch & Jones, 1997). For example, trust in a supervisor and trust in an organization are distinct and depend on qualitatively different factors (Tan & Tan, 2000). Trust in a supervisor depends on perceived ability, integrity, and benevolence, whereas trust in the organization depends on more distal variables of perceived organizational support and justice. General trust is frequently considered to be a personality trait, whereas specific trust is frequently viewed as an outcome of a specific interaction (Couch & Jones, 1997). Several studies show, however, that general and specific trust are not necessarily tied to the "trait and state" distinctions and may instead simply refer to different levels of detail (Sitkin & Roth, 1993). Developing a high degree of functional specificity in the trust of automation may require specific information for each level of detail of the automation.

The basis of trust can be considered along a dimension of attributional abstraction, which varies from demonstrations of competence to the intentions of the agent. A recent review of trust literature concluded that three general levels summarize the bases of trust: ability, integrity, and benevolence (Mayer et al., 1995). Ability is the group of skills, competencies, and characteristics that enable the trustee to influence the domain. Integrity is the degree to which the trustee adheres to a set of principles the trustor finds acceptable. Benevolence is the extent to which the intents and motivations of the trustee are aligned with those of the trustor.

In the context of interpersonal relationships, Rempel et al. (1985) viewed trust as an evolving phenomenon, with the basis of trust changing as the relationship progressed. They argued that predictability, the degree to which future behavior can be anticipated (and which is similar to ability), forms the basis of trust early in a relationship. This is followed by dependability, which is the degree to which behavior is consistent and is similar to integrity. As the relationship matures, the basis of trust ultimately shifts to faith, which is a more general judgment that a person can be relied upon and is similar to benevolence. A similar progression emerged in a study of operators' adaptation to new technology (Zuboff, 1988). Trust in that context depended on trial-and-error experience, followed by understanding of the technology's operation, and finally faith. Lee and Moray (1992) made similar distinctions in defining the factors that influence trust in automation. They identified performance, process, and purpose as the general bases of trust.

Performance refers to the current and historical operation of the automation and includes characteristics such as reliability, predictability, and ability. Performance information describes what the automation does. More specifically, performance refers to the competency or expertise as demonstrated by its ability to achieve the operator's goals. Because performance is linked to the ability to achieve specific goals, it demonstrates the task- and situation-dependent nature of trust. This is similar to Sheridan's (1992) concept of robustness as a basis for trust in human-automation relationships. The operator will tend to trust automation that performs in a manner that reliably achieves his or her goals.

Process is the degree to which the automation's algorithms are appropriate for the situation and able to achieve the operator's goals. Process information describes how the automation operates. In interpersonal relationships, this corresponds to the consistency of actions associated with adherence to a set of acceptable principles (Mayer et al., 1995). Process as a basis for trust reflects a shift away from focus on specific behaviors and toward qualities and characteristics attributed to an agent. With process, trust is in the agent and not in the specific actions of the agent. Because of this, the process basis of trust relies on dispositional attributions and inferences drawn from the performance of the agent. In this sense, process is similar to dependability and integrity. Openness, the willingness to give and receive ideas, is another element of the process basis of trust. Interestingly, consistency and openness are more important for trust in peers than for trust in supervisors or subordinates (Schindler & Thomas, 1993).

In the context of automation, the process basis of trust refers to the algorithms and operations that govern the behavior of the automation. This concept is similar to Sheridan's (1992) concept of understandability in human-automation relationships. The operator will tend to trust the automation if its algorithms can be understood and seem capable of achieving the operator's goals in the current situation.

Purpose refers to the degree to which the automation is being used within the realm of the designer's intent. Purpose describes why the automation was developed. Purpose corresponds to faith and benevolence and reflects the perception that the trustee has a positive orientation toward the trustor. With interpersonal relationships, the perception of such a positive orientation depends on the intentions and motives of the trustee. This can take the form of abstract, generalized value congruence (Sitkin & Roth, 1993), which can be described as whether, and to what extent, the trustee has a motive to lie (Hovland, Janis, & Kelly, 1953). The purpose basis of trust reflects the attribution of these characteristics to the automation. Frequently, whether or not this attribution takes place will depend on whether the designer's intent has been communicated to the operator. If so, the operator will tend to trust automation to achieve the goals it was designed to achieve.

Table 1 builds on the work of Mayer et al. (1995) and shows how the many characteristics of a trustee that affect trust can be summarized by the three bases of performance, process, and purpose.


TABLE 1: Summary of the Dimensions That Describe the Basis of Trust

Study                            Basis of Trust                       Summary Dimension
Barber (1983)                    Competence                           Performance
                                 Persistence                          Process
                                 Fiduciary responsibility             Purpose
Butler & Cantrell (1984)         Competence                           Performance
                                 Integrity                            Process
                                 Consistency                          Process
                                 Loyalty                              Purpose
                                 Openness                             Process
Cook & Wall (1980)               Ability                              Performance
                                 Intentions                           Purpose
Deutsch (1960)                   Ability                              Performance
                                 Intentions                           Purpose
Gabarro (1978)                   Integrity                            Process
                                 Motives                              Purpose
                                 Openness                             Process
                                 Discreetness                         Process
                                 Functional/specific competence       Performance
                                 Interpersonal competence             Performance
                                 Business sense                       Performance
                                 Judgment                             Performance
Hovland, Janis, & Kelly (1953)   Expertise                            Performance
                                 Motivation to lie                    Purpose
Jennings (1967)                  Loyalty                              Purpose
                                 Predictability                       Process
                                 Accessibility                        Process
                                 Availability                         Process
Kee & Knox (1970)                Competence                           Performance
                                 Motives                              Purpose
Mayer et al. (1995)              Ability                              Performance
                                 Integrity                            Process
                                 Benevolence                          Purpose
Mishra (1996)                    Competency                           Performance
                                 Reliability                          Performance
                                 Openness                             Process
                                 Concern                              Purpose
Moorman et al. (1993)            Integrity                            Process
                                 Willingness to reduce uncertainty    Process
                                 Confidentiality                      Process
                                 Expertise                            Performance
                                 Tactfulness                          Process
                                 Sincerity                            Process
                                 Congeniality                         Performance
                                 Timeliness                           Performance
Rempel et al. (1985)             Reliability                          Performance
                                 Dependability                        Process
                                 Faith                                Purpose
Sitkin & Roth (1993)             Context-specific reliability         Performance
                                 Generalized value congruence         Purpose
Zuboff (1988)                    Trial-and-error experience           Performance
                                 Understanding                        Process
                                 Leap of faith                        Purpose


As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.

Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms, associated with the dimension of process; analysis of internal mechanisms can support inferences regarding the designer's intent, associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by the manager (purpose basis for trust) and the manager's actions (performance basis of trust) has a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values. The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings or procedures guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oatley, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oatley). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making, even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoner's dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation, and these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions. It also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner; however, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional, nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them. People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or the intended use of the system (purpose). Early in the relationship with automation, there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people; rather, it reflects how people are often thrust into relationships with automation in a way that forces trust to develop differently than it does in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct for understanding reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.
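
The distinction can be made concrete in a few lines of code. The sketch below is our illustration, not anything from the studies cited above; the function name and return convention are assumptions chosen only to express the two indicator logics.

def operator_should_act(signal_present, automation_type):
    # Reliance-oriented automation signals normal operation, so the
    # operator acts when that indication is absent (implicit indicator).
    if automation_type == "reliance":
        return not signal_present
    # Compliance-oriented automation (e.g., a hazard warning) signals an
    # abnormal situation, so the operator acts when it fires (explicit
    # indicator).
    if automation_type == "compliance":
        return signal_present
    raise ValueError(automation_type)

print(operator_should_act(False, "reliance"))   # True: the "all clear" is gone
print(operator_should_act(True, "compliance"))  # True: the warning fired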

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of e-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
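
A small simulation can illustrate this bifurcation. The following sketch is our own rendering of the closed loop described above, not a model from the literature; every parameter (thresholds, gains, noise level) is an assumption chosen only to show the qualitative effect.

import random

def simulate(initial_trust, capability=0.9, steps=200, seed=1):
    random.seed(seed)
    trust = initial_trust
    relies = False
    for _ in range(steps):
        relies = trust > 0.5  # engage the automation only if trust is high enough
        if relies:
            # Reliance lets the operator observe the automation, so trust
            # moves toward the observed capability (with observation noise).
            observed = capability + random.gauss(0, 0.05)
            trust += 0.1 * (observed - trust)
        else:
            # Under manual control the automation goes unobserved, so trust
            # drifts toward a low prior instead of updating on evidence.
            trust += 0.02 * (0.3 - trust)
        trust = min(1.0, max(0.0, trust))
    return trust, relies

for t0 in (0.48, 0.52):
    final_trust, relying = simulate(t0)
    print(f"initial trust {t0:.2f} -> final trust {final_trust:.2f}, relying: {relying}")

Two simulated operators whose initial trust differs by only 0.04 end up at opposite extremes: one settles into reliance on a highly capable automation, and the other never engages it and so never accumulates the evidence that would revise that choice.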

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function. The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
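
One way to write this first-order dynamic explicitly (the notation is ours, not that of the original analysis) is to let T(t) denote trust, C(t) the capability-related input that a fault perturbs, and \tau the time constant:

    \tau \frac{dT(t)}{dt} = C(t) - T(t)

For a step drop in capability from C_0 to C_1 at time t_0 (a fault), the solution is the exponential response described above:

    T(t) = C_1 + \left( T(t_0) - C_1 \right) e^{-(t - t_0)/\tau}, \quad t \geq t_0

The largest change occurs immediately after the fault, the residual effect decays with time constant \tau, and a small \tau corresponds to high temporal specificity, with trust quickly tracking the current capability of the automation.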

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
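
Hysteresis of this kind is easy to exhibit in a toy model. The sketch below is our illustration, not an analysis from the literature cited above; the thresholds and gain are assumptions, and for simplicity trust continues to track reliability even while the operator is not relying on the automation (plausible when its indications remain observable, as with information acquisition automation).

def sweep(reliabilities, trust=0.9, relying=True, gain=0.15):
    switches = []
    for r in reliabilities:
        trust += gain * (r - trust)  # first-order lag: trust tracks reliability with inertia
        if relying and trust < 0.5:  # abandon the automation at low trust
            relying = False
            switches.append(("disengage", round(r, 2)))
        elif not relying and trust > 0.7:  # demand more trust to re-engage
            relying = True
            switches.append(("re-engage", round(r, 2)))
    return switches

down = [x / 100 for x in range(95, 19, -1)]  # reliability falls 0.95 -> 0.20
up = [x / 100 for x in range(20, 96)]        # then recovers 0.20 -> 0.95
print(sweep(down + up))  # the two switch points occur at different reliabilities

Because trust lags reliability and re-engagement demands more trust than continued use, the reliability at which the operator abandons the automation on the way down differs from the reliability at which he or she returns to it on the way up: a delayed response that depends on the direction of the change.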

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that automation should have the final authority for decisions and actions that must be allocated to automation in time-critical situations (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: Low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors cause trust and reliance to decline more for those in a familiar city, whose self-confidence was high, as compared with those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.
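
A crude way to state this trade-off (our simplification of the findings above; the bias term is an assumption, not an estimated parameter) is that the operator prefers automatic control when trust exceeds self-confidence by more than some personal bias toward manual control:

def prefers_automation(trust, self_confidence, manual_bias=0.05):
    # Inclination to rely on automation grows with the margin of trust
    # over self-confidence in manual control.
    return (trust - self_confidence) > manual_bias

print(prefers_automation(0.8, 0.6))  # high trust, lower self-confidence: rely
print(prefers_automation(0.5, 0.8))  # high self-confidence, lower trust: manual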

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).

One approach to reducing overtrust in the automation and increasing detection of failures is adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation and environmental situations that cause automation to perform poorly are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and product dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the process dimension was most influenced by the presence or absence of a fault, as compared with the performance or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: Negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al.; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of the automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an on-line survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.
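The normative rationale for coding source reliability in a display can be shown with a small signal detection example. This sketch is not the Montgomery and Sorkin (1996) procedure; it only illustrates why, for independent binary sources, a vote should be weighted by the log odds of the source's reliability, which is the kind of quantity a luminance code might be designed to convey.

```python
# Illustrative weighting of independent binary sources by reliability
# (an assumption-laden sketch, not the procedure from the cited study).
import math

def weighted_vote(votes, reliabilities):
    """votes: +1 ("signal") or -1 ("noise") per source;
    reliabilities: P(correct) per source, strictly between 0 and 1."""
    score = 0.0
    for vote, p in zip(votes, reliabilities):
        score += vote * math.log(p / (1 - p))  # log-odds weight per source
    return score  # score > 0 favors "signal present"

# A 0.9-reliable source outvotes two conflicting 0.6-reliable sources.
print(weighted_vote([+1, -1, -1], [0.9, 0.6, 0.6]))  # ~1.39 > 0
```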

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent. (One way these properties might be scored is sketched after this list.)
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
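Calibration, resolution, and specificity are treated here as qualitative properties of trust. The sketch below is one assumed operationalization, not the authors' method: it scores calibration as the mean absolute mismatch between trust and capability, and resolution as their correlation across situations.

```python
# Hedged operationalization of two properties of trust (assumptions only:
# the text defines calibration and resolution qualitatively).
from statistics import correlation, mean  # correlation requires Python 3.10+

# Hypothetical ratings: operator trust and automation capability per situation.
trust      = [0.9, 0.8, 0.4, 0.3]
capability = [0.95, 0.9, 0.5, 0.4]

# Calibration: how closely the level of trust matches the level of capability.
calibration_error = mean(abs(t - c) for t, c in zip(trust, capability))

# Resolution: how well trust discriminates high- from low-capability situations.
resolution = correlation(trust, capability)

print(f"calibration error = {calibration_error:.3f}, resolution = {resolution:.3f}")
```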

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance, it can also make a fundamental contribution to safety and performance. This is not a new realization; as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.


Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oatley, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Watanabe, Y., & Yamagishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished PhD dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished PhD dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished PhD dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Van de Ven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his PhD in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an MS in industrial engineering, with a focus on trust in automation, from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003



the intentions of the agent. A recent review of trust literature concluded that three general levels summarize the bases of trust: ability, integrity, and benevolence (Mayer et al., 1995). Ability is the group of skills, competencies, and characteristics that enable the trustee to influence the domain. Integrity is the degree to which the trustee adheres to a set of principles the trustor finds acceptable. Benevolence is the extent to which the intents and motivations of the trustee are aligned with those of the trustor.

In the context of interpersonal relationships, Rempel et al. (1985) viewed trust as an evolving phenomenon, with the basis of trust changing as the relationship progressed. They argued that predictability, the degree to which future behavior can be anticipated (and which is similar to ability), forms the basis of trust early in a relationship. This is followed by dependability, which is the degree to which behavior is consistent and is similar to integrity. As the relationship matures, the basis of trust ultimately shifts to faith, which is a more general judgment that a person can be relied upon and is similar to benevolence. A similar progression emerged in a study of operators' adaptation to new technology (Zuboff, 1988). Trust in that context depended on trial-and-error experience, followed by understanding of the technology's operation, and finally faith. Lee and Moray (1992) made similar distinctions in defining the factors that influence trust in automation. They identified performance, process, and purpose as the general bases of trust.

Performance refers to the current and historical operation of the automation and includes characteristics such as reliability, predictability, and ability. Performance information describes what the automation does. More specifically, performance refers to the competency or expertise as demonstrated by its ability to achieve the operator's goals. Because performance is linked to the ability to achieve specific goals, it demonstrates the task- and situation-dependent nature of trust. This is similar to Sheridan's (1992) concept of robustness as a basis for trust in human-automation relationships. The operator will tend to trust automation that performs in a manner that reliably achieves his or her goals.

Process is the degree to which the automation's algorithms are appropriate for the situation and able to achieve the operator's goals. Process information describes how the automation operates. In interpersonal relationships, this corresponds to the consistency of actions associated with adherence to a set of acceptable principles (Mayer et al., 1995). Process as a basis for trust reflects a shift away from focus on specific behaviors and toward qualities and characteristics attributed to an agent. With process, trust is in the agent and not in the specific actions of the agent. Because of this, the process basis of trust relies on dispositional attributions and inferences drawn from the performance of the agent. In this sense, process is similar to dependability and integrity. Openness, the willingness to give and receive ideas, is another element of the process basis of trust. Interestingly, consistency and openness are more important for trust in peers than for trust in supervisors or subordinates (Schindler & Thomas, 1993).

In the context of automation, the process basis of trust refers to the algorithms and operations that govern the behavior of the automation. This concept is similar to Sheridan's (1992) concept of understandability in human-automation relationships. The operator will tend to trust the automation if its algorithms can be understood and seem capable of achieving the operator's goals in the current situation.

Purpose refers to the degree to which the automation is being used within the realm of the designer's intent. Purpose describes why the automation was developed. Purpose corresponds to faith and benevolence and reflects the perception that the trustee has a positive orientation toward the trustor. With interpersonal relationships, the perception of such a positive orientation depends on the intentions and motives of the trustee. This can take the form of abstract, generalized value congruence (Sitkin & Roth, 1993), which can be described as whether, and to what extent, the trustee has a motive to lie (Hovland, Janis, & Kelly, 1953). The purpose basis of trust reflects the attribution of these characteristics to the automation. Frequently, whether or not this attribution takes place will depend on whether the designer's intent has been communicated to the operator. If so, the operator will tend to trust automation to achieve the goals it was designed to achieve.

Table 1 builds on the work of Mayer et al. (1995) and shows how the many characteristics of a trustee that affect trust can be summarized by the three bases of performance, process, and purpose.


TABLE 1: Summary of the Dimensions That Describe the Basis of Trust

Study                            Basis of Trust                       Summary Dimension

Barber (1983)                    Competence                           Performance
                                 Persistence                          Process
                                 Fiduciary responsibility             Purpose

Butler & Cantrell (1984)         Competence                           Performance
                                 Integrity                            Process
                                 Consistency                          Process
                                 Loyalty                              Purpose
                                 Openness                             Process

Cook & Wall (1980)               Ability                              Performance
                                 Intentions                           Purpose

Deutsch (1960)                   Ability                              Performance
                                 Intentions                           Purpose

Gabarro (1978)                   Integrity                            Process
                                 Motives                              Purpose
                                 Openness                             Process
                                 Discreetness                         Process
                                 Functional/specific competence       Performance
                                 Interpersonal competence             Performance
                                 Business sense                       Performance
                                 Judgment                             Performance

Hovland, Janis, & Kelly (1953)   Expertise                            Performance
                                 Motivation to lie                    Purpose

Jennings (1967)                  Loyalty                              Purpose
                                 Predictability                       Process
                                 Accessibility                        Process
                                 Availability                         Process

Kee & Knox (1970)                Competence                           Performance
                                 Motives                              Purpose

Mayer et al. (1995)              Ability                              Performance
                                 Integrity                            Process
                                 Benevolence                          Purpose

Mishra (1996)                    Competency                           Performance
                                 Reliability                          Performance
                                 Openness                             Process
                                 Concern                              Purpose

Moorman et al. (1993)            Integrity                            Process
                                 Willingness to reduce uncertainty    Process
                                 Confidentiality                      Process
                                 Expertise                            Performance
                                 Tactfulness                          Process
                                 Sincerity                            Process
                                 Congeniality                         Performance
                                 Timeliness                           Performance

Rempel et al. (1985)             Reliability                          Performance
                                 Dependability                        Process
                                 Faith                                Purpose

Sitkin & Roth (1993)             Context-specific reliability         Performance
                                 Generalized value congruence         Purpose

Zuboff (1988)                    Trial-and-error experience           Performance
                                 Understanding                        Process
                                 Leap of faith                        Purpose


As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.

Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms, associated with the dimension of process; analysis of internal mechanisms can support inferences regarding the designer's intent, associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by the manager (purpose basis for trust) and the manager's actions (performance basis of trust) has a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance, as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values. The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings or procedures guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oately, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oately). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation and that these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions. It also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner; however, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional, nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous, computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them. People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop differently than it does in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct for understanding reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of e-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
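To make this closed-loop amplification concrete, the following minimal sketch simulates the reliance-trust loop just described. It is an illustration only, not a model fitted to any of the cited studies: the threshold rule, gains, and capability value are all assumed parameters.

# Illustrative simulation of the closed-loop trust-reliance feedback.
# The update rule and all parameter values are assumptions for demonstration.

def simulate_trust(initial_trust, capability=0.9, neutral_prior=0.5,
                   reliance_threshold=0.5, gain=0.2, steps=50):
    """Trust grows toward observed capability only when the operator
    relies on the automation; otherwise little new evidence arrives."""
    trust = initial_trust
    for _ in range(steps):
        if trust > reliance_threshold:
            # Reliance lets the operator observe performance,
            # pulling trust toward the automation's capability.
            trust += gain * (capability - trust)
        else:
            # Under manual control the automation goes unobserved,
            # so trust drifts only slightly toward a neutral prior.
            trust += 0.01 * (neutral_prior - trust)
    return trust

# Two operators with nearly identical predispositions diverge:
print(round(simulate_trust(0.52), 2))  # engages automation; trust ends near 0.9
print(round(simulate_trust(0.48), 2))  # stays manual; trust hovers near 0.5

Such a threshold-plus-feedback structure reproduces the qualitative pattern described above: one operator's trust climbs toward the automation's true capability while the other's remains low, even though their initial levels of trust differed only slightly.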

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function. The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
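The cited analysis can be illustrated with a standard first-order lag; the exact equation Lee and Moray (1994) estimated from their time series is not reproduced here, so the following form should be read as a generic illustration of first-order trust dynamics rather than their fitted model:

\tau \frac{dT(t)}{dt} = T_{\infty}(t) - T(t)
\quad\Rightarrow\quad
T(t) = T_{\infty} + \left(T_{0} - T_{\infty}\right) e^{-t/\tau}

Here T(t) is the level of trust, T_0 is trust at the moment of the fault, T_infinity is the level of trust supported by the automation's changed capability, and tau is the time constant. A fault steps T_infinity downward, so trust decays exponentially toward the new level: the largest change occurs immediately and a residual effect is distributed over time, exactly the pattern described above.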

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
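Hysteresis of this kind can be sketched by letting the time constant depend on the direction of the change; the asymmetry below (rapid decline, slow recovery) is an assumed parameterization chosen for demonstration, not a fitted model:

# Illustrative hysteresis: the trust response depends on the direction of
# the change in automation capability. The asymmetric time constants are
# assumptions for demonstration.

def update_trust(trust, capability, tau_down=2.0, tau_up=10.0, dt=1.0):
    """First-order lag with a shorter time constant for declines in trust
    than for recovery, so trust is lost quickly and regained slowly."""
    tau = tau_down if capability < trust else tau_up
    return trust + (dt / tau) * (capability - trust)

trust = 0.9
# A fault drops capability from 0.9 to 0.4 for 10 steps; then it is repaired.
for capability in [0.4] * 10 + [0.9] * 20:
    trust = update_trust(trust, capability)
print(round(trust, 2))  # ~0.84: recovery still lags 20 steps after the repair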

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that the final authority for decisions and actions in time-critical situations should be allocated to automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: Low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors caused trust and reliance to decline more for those in a familiar city, whose self-confidence was high, than for those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.
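One simple way to express this interaction, offered as an assumption rather than a model drawn from the cited studies, is a choice rule in which the probability of relying on automation grows with the margin of trust over self-confidence:

import math

# Illustrative choice rule: reliance becomes more likely as trust in the
# automation exceeds self-confidence in manual control. The logistic form
# and its slope are assumptions for demonstration.

def p_rely(trust, self_confidence, slope=8.0):
    """Probability of choosing automatic over manual control."""
    return 1.0 / (1.0 + math.exp(-slope * (trust - self_confidence)))

print(round(p_rely(0.8, 0.4), 2))  # high trust, low self-confidence: ~0.96
print(round(p_rely(0.4, 0.8), 2))  # low trust, high self-confidence: ~0.04

Under such a rule, a bias in self-confidence shifts the crossover point between manual and automatic control, which is one way to read the familiar-city versus unfamiliar-city results described above.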

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).
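The sense in which imperfect monitoring can be "optimal" is easy to see with an expected-cost calculation; the cost values and failure rate below are illustrative assumptions and are not drawn from Moray's (2003) analysis:

# Illustrative expected-cost view of eutactic monitoring: sampling the
# automation more often costs attention but shortens the time a failure
# goes unnoticed. All values are assumptions for demonstration.

def expected_cost(sampling_rate, p_failure=0.05,
                  cost_per_sample=1.0, cost_undetected=100.0):
    """Cost per unit time: attention spent monitoring plus the expected
    cost of failures that go undetected between samples."""
    mean_detection_lag = 1.0 / sampling_rate
    return (sampling_rate * cost_per_sample
            + p_failure * cost_undetected * mean_detection_lag)

for rate in (0.5, 1.0, 2.0, 4.0):
    print(rate, round(expected_cost(rate), 2))
# Neither constant vigilance nor complete neglect minimizes cost;
# an intermediate sampling rate does (here, near rate = 2).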

One approach to reducing overtrust in the automation and increasing the detection of failures is adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and product dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating corresponding to the process dimension was the one most influenced by the presence or absence of a fault (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: Negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al.; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, the appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of the automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an on-line survey of more than 1400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced the trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Curry, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: that specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, its design basis, and its range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent (see the sketch following this list).
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
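Calibration and resolution can be quantified once trust ratings and automation performance are recorded on comparable scales. The sketch below is a minimal illustration in the spirit of the calibration literature for probability judgments (Lichtenstein, Fischhoff, & Phillips, 1982), not a procedure from this paper; the 0-to-1 trust scale, the bin count, and the variable names are assumptions.

    import numpy as np

    def trust_calibration(trust, success, n_bins=5):
        """Compare trust ratings (0-1) with automation success (0/1).
        Returns (calibration_error, resolution): low calibration error
        means trust matches capability; high resolution means trust
        discriminates situations of differing capability."""
        trust = np.asarray(trust, dtype=float)
        success = np.asarray(success, dtype=float)
        base_rate = success.mean()
        bins = np.minimum((trust * n_bins).astype(int), n_bins - 1)
        cal_err = resolution = 0.0
        for b in range(n_bins):
            mask = bins == b
            if not mask.any():
                continue
            weight = mask.mean()  # fraction of cases falling in this bin
            cal_err += weight * (trust[mask].mean() - success[mask].mean()) ** 2
            resolution += weight * (success[mask].mean() - base_rate) ** 2
        return cal_err, resolution

    # Example: trust ratings paired with observed automation outcomes
    print(trust_calibration([0.9, 0.8, 0.3, 0.2, 0.7, 0.4],
                            [1, 1, 0, 0, 1, 0]))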

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research on human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.


Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation 1: Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation 2: Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003



TABLE 1: Summary of the Dimensions That Describe the Basis of Trust

Study                            Basis of Trust                       Summary Dimension

Barber (1983)                    Competence                           Performance
                                 Persistence                          Process
                                 Fiduciary responsibility             Purpose

Butler & Cantrell (1984)         Competence                           Performance
                                 Integrity                            Process
                                 Consistency                          Process
                                 Loyalty                              Purpose
                                 Openness                             Process

Cook & Wall (1980)               Ability                              Performance
                                 Intentions                           Purpose

Deutsch (1960)                   Ability                              Performance
                                 Intentions                           Purpose

Gabarro (1978)                   Integrity                            Process
                                 Motives                              Purpose
                                 Openness                             Process
                                 Discreetness                         Process
                                 Functional/specific competence       Performance
                                 Interpersonal competence             Performance
                                 Business sense                       Performance
                                 Judgment                             Performance

Hovland, Janis, & Kelly (1953)   Expertise                            Performance
                                 Motivation to lie                    Purpose

Jennings (1967)                  Loyalty                              Purpose
                                 Predictability                       Process
                                 Accessibility                        Process
                                 Availability                         Process

Kee & Knox (1970)                Competence                           Performance
                                 Motives                              Purpose

Mayer et al. (1995)              Ability                              Performance
                                 Integrity                            Process
                                 Benevolence                          Purpose

Mishra (1996)                    Competency                           Performance
                                 Reliability                          Performance
                                 Openness                             Process
                                 Concern                              Purpose

Moorman et al. (1993)            Integrity                            Process
                                 Willingness to reduce uncertainty    Process
                                 Confidentiality                      Process
                                 Expertise                            Performance
                                 Tactfulness                          Process
                                 Sincerity                            Process
                                 Congeniality                         Performance
                                 Timeliness                           Performance

Rempel et al. (1985)             Reliability                          Performance
                                 Dependability                        Process
                                 Faith                                Purpose

Sitkin & Roth (1993)             Context-specific reliability         Performance
                                 Generalized value congruence         Purpose

Zuboff (1988)                    Trial and error experience           Performance
                                 Understanding                        Process
                                 Leap of faith                        Purpose


(1995) and shows how the many characteristics of a trustee that affect trust can be summarized by the three bases of performance, process, and purpose. As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.

Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms associated with the dimension of process; analysis of internal mechanisms can support inferences regarding the designer's intent associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by the manager (purpose basis for trust) and the manager's actions (performance basis of trust) has a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).
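To make the expected value account concrete, the following toy sketch treats trust as a running estimate of the probability that the automation will succeed, updated with accumulating experience, and treats reliance as a choice that maximizes expected payoff. This is an illustration of the rational choice perspective summarized above, not a model proposed by Holmes (1991) or by this paper; the smoothing rule and all payoff values are assumptions.

    def update_trust(successes, failures):
        """Estimate P(automation succeeds) from cumulative experience,
        using a Laplace-smoothed proportion (assumed estimator)."""
        return (successes + 1) / (successes + failures + 2)

    def rely(p_success, gain=1.0, loss=-2.0, manual_value=0.2):
        """Rely on the automation only when its expected payoff exceeds
        the expected value of manual control (assumed payoffs)."""
        expected_automation = p_success * gain + (1 - p_success) * loss
        return expected_automation > manual_value

    # Trust accumulates over interactions: 9 successes, 1 failure so far
    p = update_trust(successes=9, failures=1)
    print(round(p, 2), rely(p))  # -> 0.83 True

In such a model, trust rises and falls with cumulative interaction, exactly as the rational choice view predicts, but it omits the affective and analogical influences discussed below.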

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values. The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings or procedures guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oately, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oately). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation, and these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions. It also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner; however, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional, nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Trust is often defined in a repeated-decision situation, in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to the automation, depends on people's assessment of how others perceive them. People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation than to the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships: it often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop differently from the way it does in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct for understanding reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.
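The distinction can be made concrete with a minimal sketch (illustrative only; the function names and signals are assumed, not drawn from Meyer, 2001). Reliance-oriented automation calls for action when an expected normal-operation signal is absent, whereas compliance-oriented automation calls for action when an explicit alarm is present:

```python
# Illustrative sketch of the two indicator logics (assumed names/signals).

def should_act_reliance(ok_signal_present: bool) -> bool:
    """Implicit indicator: act when the normal-operation signal is absent."""
    return not ok_signal_present

def should_act_compliance(alarm_present: bool) -> bool:
    """Explicit indicator: act when an abnormal-situation warning appears."""
    return alarm_present

print(should_act_reliance(ok_signal_present=False))  # True: silence means act
print(should_act_compliance(alarm_present=True))     # True: alarm means act
```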

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of E-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
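This divergence can be made concrete with a minimal simulation sketch (illustrative only, not the model fitted in the studies cited; the time constant, engagement threshold, capability, and low-trust prior are all assumed values chosen to make the feedback dynamics visible):

```python
# Closed-loop sketch: reliance gates the information available to update
# trust, so small initial differences in trust can cascade into opposite
# stable outcomes. All parameters are assumed, for illustration only.

def simulate(initial_trust, capability=0.9, tau=5.0, threshold=0.5, steps=50):
    trust = initial_trust
    for _ in range(steps):
        if trust > threshold:
            # Relying: observation pulls trust toward the automation's capability.
            trust += (capability - trust) / tau
        else:
            # Not relying: little observation, trust drifts toward a low prior.
            trust += (0.3 - trust) / (4 * tau)
    return round(trust, 3)

# Two operators whose initial trust differs only slightly end up on opposite
# sides of the engagement threshold and diverge.
print(simulate(0.52))  # relies; trust converges toward 0.9
print(simulate(0.48))  # never engages; trust decays toward 0.3
```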

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function: the largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
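One consistent way to write such a first-order model is sketched below (this matches the verbal description above rather than reproducing the equation fitted by Lee & Moray, 1994; here τ denotes the time constant of trust and C(t) an assumed capability signal):

```latex
% First-order lag: trust T(t) tracks a capability signal C(t) with time
% constant tau. After a step drop in C at time t_0 (a fault), trust decays
% exponentially: the largest change occurs immediately, with the residual
% effect distributed over time.
\begin{align}
  \tau \frac{dT(t)}{dt} &= C(t) - T(t)\\
  T(t) &= C_{\mathrm{new}} + \bigl(T(t_0) - C_{\mathrm{new}}\bigr)\,
          e^{-(t - t_0)/\tau}, \qquad t \ge t_0
\end{align}
```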

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that final authority for decisions and actions in time-critical situations should be allocated to the automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors caused trust and reliance to decline more for those in a familiar city, whose self-confidence was high, than for those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).

One approach to reduce overtrust in the automation and increase detection of failures is through adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.
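A minimal sketch of this kind of adaptive schedule follows (the function names and timing are assumed; the cited study used a 10-min manual interval within a longer session):

```python
# Sketch: periodically return a monitored task to manual control so the
# operator keeps directly observing the automation's behavior. The window
# placement here is an assumed example, not the cited experimental protocol.
def control_mode(minute: int, manual_window=(30, 40)) -> str:
    """Automate the monitoring task except during a fixed manual window."""
    start, end = manual_window
    return "manual" if start <= minute < end else "automatic"

print([control_mode(m) for m in range(0, 60, 10)])
# ['automatic', 'automatic', 'automatic', 'manual', 'automatic', 'automatic']
```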

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the process dimension was most influenced by the presence or absence of a fault, as compared with the performance or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al.; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an on-line survey of more than 1400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.
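In the spirit of the luminance coding described by Montgomery and Sorkin (1996), a minimal sketch of such a mapping follows (the specific function, linear form, and floor value are assumed for illustration):

```python
# Illustrative sketch: render more reliable sources brighter so their cues
# carry more visual weight. The mapping and floor value are assumed.
def luminance_for(reliability: float, floor: float = 0.2) -> float:
    """Map a reliability estimate in [0, 1] to a display luminance in
    [floor, 1.0]; the floor keeps unreliable sources legible."""
    reliability = max(0.0, min(1.0, reliability))
    return floor + (1.0 - floor) * reliability

for r in (0.95, 0.60, 0.30):
    print(f"reliability {r:.2f} -> luminance {luminance_for(r):.2f}")
```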

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Curry, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation (see the sketch following this list).
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.
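As one way to realize the guideline of showing past performance, a minimal sketch follows (the class name, windowing, and display form are assumed; the point is that a recent-history reliability estimate could accompany the automation's output):

```python
# Sketch: maintain a recent-history reliability estimate that a display
# could present alongside the automation's output. Window size is assumed.
from collections import deque

class PerformanceHistory:
    """Track recent successes/failures of an automated function."""
    def __init__(self, window: int = 50):
        self.outcomes = deque(maxlen=window)  # True = correct behavior

    def record(self, success: bool) -> None:
        self.outcomes.append(success)

    def reliability(self) -> float:
        """Fraction of recent episodes handled correctly (0 if no data)."""
        return sum(self.outcomes) / len(self.outcomes) if self.outcomes else 0.0

history = PerformanceHistory(window=10)
for outcome in [True, True, False, True, True]:
    history.record(outcome)
print(f"Recent reliability: {history.reliability():.0%}")  # 80%
```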

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations (see the sketch following this list).
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
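As one way to realize the guideline of showing how past performance depends on situations, a minimal sketch follows (the context classes and log are invented for illustration; the point is that reliability broken down by situation reveals where the automation is capable):

```python
# Sketch: group recorded outcomes by situation class so operators can see
# context-dependent capability. Context labels and data are assumed.
from collections import defaultdict

performance_log = [
    ("clear_weather", True), ("clear_weather", True), ("clear_weather", True),
    ("heavy_rain", True), ("heavy_rain", False), ("heavy_rain", False),
]

by_context = defaultdict(list)
for context, success in performance_log:
    by_context[context].append(success)

for context, outcomes in by_context.items():
    rate = sum(outcomes) / len(outcomes)
    print(f"{context}: {rate:.0%} reliable over {len(outcomes)} episodes")
# clear_weather: 100% reliable; heavy_rain: 33% reliable
```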

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research on human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.


Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen M S (2000) Training for trust in decision aids with appli-cations to the rotorcraftrsquos pilotrsquos associate Presented at theInternational Conference on Human Performance SituationAwareness and Automation Savannah GA Abstract retrievedMarch 8 2004 from httpwwwcog-techcompapersTrustAdaptive20Automation20abstract20finalpdf

Cohen M S Parasuraman R amp Freeman J (1999) Trust indecision aids A model and its training implications (TechReport USAATCOM TR 97-D-4) Arlington VA CognitiveTechnologies

Conejo R A amp Wickens C D (1997) The effects of highlightingvalidity and feature type on air-to-ground target acquisitionperformance (ARL-97-11NAWC-ONR-97-1) Savoy ILAviation Research Laboratory Institute for Aviation

Cook J amp Wall T (1980) New work attitude measures of trustorganizational commitment and personal need non-fulfillmentJournal of Occupational Psychology 53 39ndash52

Corritore C L Kracher B amp Wiedenbeck S (2001a MarchndashApril) Trust in the online environment Workshop presentedat the CHI 2001 Conference on Human Factors in ComputingSystems Seattle WA

Corritore C L Kracher B amp Wiedenbeck S (2001b) Trust inthe online environment In M J Smith G Salvendy D Harrisamp R J Koubek (Eds) Usability evaluation and interfacedesign Cognitive engineering intelligent agents and virtualreality (pp 1548-1552) Mahwah NJ Erlbaum

Corritore C L Kracher B amp Wiedenbeck S (2003a) On-linetrust Concepts evolving themes a model International Journalof Human-Computer Studies 58 737ndash758

Corritore C L Kracher B amp Wiedenbeck S (Eds) (2003b)Trust and technology [Special issue] International Journal ofHuman-Computer Studies 58(6)

Couch L L amp Jones W H (1997) Measuring levels of trustJournal of Research in Personality 31(3) 319ndash336

Coutu D L (1998) Trust in virtual teams Harvard BusinessReview 76 20ndash21

Crocoll W M amp Coury B G (1990) Status or recommendationSelecting the type of information for decision aiding InProceedings of the Human Factors Society 34th AnnualMeeting (pp 1524ndash1528) Santa Monica CA Human Factorsand Ergonomics Society

Crossman E R F W (1974) Automation and skill In E Edwardsamp F P Lees (Eds) The human operator in process control(pp 1ndash24) London Taylor amp Francis

Damasio A R (1996) The somatic marker hypothesis and thepossible functions of the prefrontal cortex PhilosophicalTransactions of the Royal Society of London Series B ndash Bio-logical Sciences 351 1413ndash1420

Damasio A R (1999) The feeling of what happens Body andemotion in the making of consciousness New York Harcourt

Damasio A R (2003) Looking for Spinoza Joy sorrow and thefeeling brain New York Harcourt

Damasio A R Tranel D amp Damasio H (1990) Individualswith sociopathic behavior caused by frontal damage fail torespond autonomically to social-stimuli Behavioural BrainResearch 41 81ndash94

Dassonville I Jolly D amp Desodt A M (1996) Trust betweenman and machine in a teleoperation system Reliability Engi-neering and System Safety 53 319ndash325

Deutsch M (1958) Trust and suspicion Journal of ConflictResolution 2 265ndash279

Deutsch M (1960) The effect of motivational orientation upontrust and suspicion Human Relations 13 123ndash139

Doney P M Cannon J P amp Mullen M R (1998) Understandingthe influence of national culture on the development of trustAcademy of Management Review 23 601ndash620

Dzindolet M T Pierce L G Beck H P amp Dawe L A (2002)The perceived utility of human and automated aids in a visualdetection task Human Factors 44 79ndash94

Dzindolet M T Pierce L G Beck H P Dawe L A ampAnderson B W (2001) Predicting misuse and disuse of com-bat identification systems Military Psychology 13 147ndash164

Entin E B Entin E E amp Serfaty D (1996) Optimizing aidedtarget-recognition performance In Proceedings of the HumanFactors and Ergonomics Society 40th Annual Meeting (pp233ndash237) Santa Monica CA Human Factors and ErgonomicsSociety

Farrell S amp Lewandowsky S (2000) A connectionist model ofcomplacency and adaptive recovery under automation Journalof Experimental Psychology ndash Learning Memory and Cognition26 395ndash410

Fine G A amp Holyfield L (1996) Secrecy trust and dangerousleisure Generating group cohesion in voluntary organizationsSocial Psychology Quarterly 59 22ndash38

Finucane M L Alhakami A Slovic P amp Johnson S M (2000)The affect heuristic in judgments of risks and benefits Journalof Behavioral Decision Making 13 1ndash17

Fishbein M amp Ajzen I (1975) Belief attitude intention andbehavior Reading MA Addison-Wesley

Fogg B Marshall J Kameda T Solomon J Rangnekar ABoyd J et al (2001) Web credibility research A method foronline experiments and early study results In CHI 2001Conference on Human Factors in Computing Systems (pp293ndash294) New York Association for Computing Machinery

Fogg B Marshall J Laraki O Osipovich A Varma C FangN et al (2001) What makes Web sites credible A report ona large quantitative study In CHI 2001 Conference on HumanFactors in Computing Systems (pp 61ndash68) New York Asso-ciation for Computing Machinery

Fox J M (1996) Effects of information accuracy on user trustand compliance In CHI 1996 Conference on Human Factorsin Computing Systems (pp 35-36) New York Association forComputing Machinery

Fox J M amp Boehm-Davis D A (1998) Effects of age and con-gestion information accuracy of advanced traveler informationon user trust and compliance Transportation Research Record1621 43ndash49

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology. Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology, and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003


(1995) and shows how the many characteristics of a trustee that affect trust can be summarized by the three bases of performance, process, and purpose. As an example, Mishra (1996) combined a literature review and interviews with 33 managers to identify four dimensions of trust. These dimensions can be linked to the three bases of trust: competency (performance), reliability (performance), openness (process), and concern (purpose). The dimensions of purpose, process, and performance provide a concise set of distinctions that describe the basis of trust across a wide range of application domains. These dimensions clearly identify three types of goal-oriented information that contribute to developing an appropriate level of trust.

Trust depends not only on the observations associated with these three dimensions but also on the inferences drawn among them. Observation of performance can support inferences regarding internal mechanisms associated with the dimension of process; analysis of internal mechanisms can support inferences regarding the designer's intent, associated with the dimension of purpose. Likewise, the underlying process can be inferred from knowledge of the designer's intent, and performance can be estimated from an understanding of the underlying process. If these inferences support trust, it follows that a system design that facilitates them would promote a more appropriate level of trust. If inferences are inconsistent with observations, then trust will probably suffer because of a poor correspondence with the observed agent. Similarly, if inferences between levels are inconsistent, trust will suffer because of poor coherence. For example, inconsistency between the intentions conveyed by the manager (purpose basis for trust) and the manager's actions (performance basis of trust) has a particularly negative effect on trust (Gabarro, 1978). Likewise, the effort to demonstrate reliability and encourage trust by enforcing procedural approaches may not succeed if value-oriented concerns (e.g., purpose) are ignored (Sitkin & Roth, 1993).

In conditions where trust is based on several factors, it can be quite stable or robust; in situations where trust depends on a single basis, it can be quite volatile or fragile (McKnight et al., 1998). Trust that is based on an understanding of the motives of the agent will be less fragile than trust based only on the reliability of the agent's performance (Rempel et al., 1985). These findings help explain why trust in automation can sometimes appear quite robust and at other times quite fragile (Kantowitz, Hanowski, & Kantowitz, 1997b). Both imperfect coherence between dimensions and imperfect correspondence with observations are likely to undermine trust and are critical considerations for encouraging appropriate levels of trust.

The availability of information at the different levels of detail and attributional abstraction may lead to high calibration, high resolution, and great temporal and functional specificity of trust. Designing interfaces and training to provide operators with information regarding the purpose, process, and performance of automation could enhance the appropriateness of trust. However, the mere availability of information will not ensure appropriate trust. The information must be presented in a manner that is consistent with the cognitive processes underlying the development of trust, as described in the following section.

Analytic, Analogical, and Affective Processes Governing Trust

Information that forms the basis of trust can be assimilated in three qualitatively different ways. Some argue that trust is based on an analytic process; others argue that trust depends on an analogical process of assessing category membership. Most importantly, trust also seems to depend on an affective response to the violation or confirmation of implicit expectancies. Figure 3 shows how these processes may interact to influence trust. Ultimately, trust is an affective response, but it is also influenced by analytic and analogical processes. The bold lines in Figure 3 show that the affective process has a greater influence on the analytic process than the analytic has on the affective (Loewenstein, Weber, Hsee, & Welch, 2001). The roles of analytic, analogical, and affective processes depend on the evolution of the relationship between the trustor and trustee, the information available to the trustor, and the way the information is displayed.

A dominant approach to trust in the organizational sciences considers trust from a rational choice perspective (Hardin, 1992). This view suggests that trust depends on an evaluation made under uncertainty in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).
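The expected value account lends itself to a simple illustration. The sketch below is our own minimal rendering of that idea, not a model specified by Holmes (1991) or the other work cited here: trust is treated as a running estimate of the probability of a favorable outcome that is nudged toward each observed outcome, with the learning rate as an arbitrary, illustrative parameter.

```python
# Minimal sketch (illustrative only): trust as an expected-value
# estimate that accumulates and dissipates with experience.

def update_trust(trust, outcome, learning_rate=0.2):
    """Nudge the trust estimate toward the observed outcome.

    trust: current estimate of P(favorable outcome), between 0 and 1
    outcome: 1.0 for a favorable interaction, 0.0 for a betrayal
    learning_rate: illustrative weight given to recent experience
    """
    return trust + learning_rate * (outcome - trust)

trust = 0.5  # neutral prior before any interaction
for outcome in [1.0, 1.0, 1.0, 0.0, 1.0]:  # hypothetical history
    trust = update_trust(trust, outcome)
    print(f"trust = {trust:.2f}")
```

On this reading, repeated favorable interactions build trust toward the observed success rate, a single betrayal pulls it down, and the learning rate determines how heavily recent interactions outweigh older ones.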

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance, as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values. The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings or procedures guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oately, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oately). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation and that these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions. It also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner; however, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional, nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them: People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or from the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop in a way different from that in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct for understanding reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication; the automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator; the automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of e-mail messages (Coutu, 1998). A similar effect is seen in which initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely it is that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
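The divergence described here can be made concrete with a toy simulation. The following Python sketch is not drawn from the studies cited in this paper; the threshold, gain, and initial values are illustrative assumptions chosen only to show how a closed loop, in which trust grows only when the automation is used and observed, can amplify a small initial difference.

```python
def trust_reliance_loop(initial_trust, steps=40, threshold=0.5, gain=0.05):
    """Toy closed loop: reliance permits observation, and observation
    feeds trust. All parameter values are illustrative, not fitted."""
    trust = initial_trust
    for _ in range(steps):
        if trust >= threshold:             # intention to rely on automation
            trust += gain * (1.0 - trust)  # use -> observation -> trust grows
        else:
            trust -= gain * trust          # disuse -> no new evidence -> decay
    return trust

# Two operators differing only slightly in initial predisposition to trust:
print(trust_reliance_loop(0.52))  # converges toward full reliance (near 1.0)
print(trust_reliance_loop(0.48))  # decays toward manual control (near 0.0)
```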

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function: the largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
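As a rough illustration of this first-order account, the sketch below integrates dT/dt = (P(t) - T(t))/tau with Euler steps. It is a minimal reading of the time-series result, not the fitted model from Lee and Moray (1994); the time constant, time step, and performance trajectory are illustrative assumptions.

```python
import numpy as np

def simulate_trust(performance, tau=10.0, dt=1.0, initial_trust=0.9):
    """First-order lag: dT/dt = (P(t) - T(t)) / tau.
    tau is the time constant that sets how quickly trust tracks
    changes in automation capability (its temporal specificity)."""
    trust = np.empty(len(performance))
    t = initial_trust
    for i, p in enumerate(performance):
        t += (p - t) * dt / tau  # Euler integration of the ODE
        trust[i] = t
    return trust

# An acute fault: capability drops at t = 50 and is repaired at t = 70.
perf = np.ones(200)
perf[50:70] = 0.2
trust = simulate_trust(perf)
# trust decays exponentially toward 0.2 after the fault, then recovers
# exponentially after the repair, lagging the automation's actual capability.
```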

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration, for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change; enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
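A two-threshold switching rule is one simple way to picture hysteresis in reliance. The sketch below is a hypothetical illustration, not an analysis method from the cited work; the on and off thresholds and the trust trajectories are assumed values.

```python
def reliance_with_hysteresis(trust_series, on=0.7, off=0.4, relying=True):
    """Reliance switches off only when trust falls below `off` and back
    on only when trust rises above `on`, so the response to a change in
    trust depends on the direction of that change (hysteresis)."""
    decisions = []
    for t in trust_series:
        if relying and t < off:
            relying = False   # trust eroded past the lower threshold
        elif not relying and t > on:
            relying = True    # trust rebuilt past the higher threshold
        decisions.append(relying)
    return decisions

# Trust falling from 0.9 to 0.5 leaves reliance intact, but trust rising
# from 0.3 to 0.5 does not restore it: the same trust level (0.5) maps to
# different behavior depending on the direction of change.
falling = reliance_with_hysteresis([0.9, 0.7, 0.5], relying=True)
rising = reliance_with_hysteresis([0.3, 0.45, 0.5], relying=False)
print(falling, rising)  # [True, True, True] [False, False, False]
```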

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that the final authority for decisions and actions in time-critical situations should be allocated to the automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors caused trust and reliance to decline more for those in a familiar city, whose self-confidence was high, than for those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust; these results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation's performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in the automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996); the findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).

One approach to reduce overtrust in the automation and increase detection of failures is adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with responsibility for other tasks seems to increase reliance on the automation.

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in the trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the process dimension was most influenced by the presence or absence of a fault, as compared with the ratings associated with the performance or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al.; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, the appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of automation-related information is usually mediated by a display. This suggests that the appropriateness of trust (that is, the match between the trust and the capabilities of the automation) depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an online survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced the trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form, in particular the emphasis on concrete, realistic representations, can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research on human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure in the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.

Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003

suggests that trust depends on an evaluation made under uncertainty in which decision makers use their knowledge of the motivations and interests of the other party to maximize gains and minimize losses. Individuals are assumed to make choices based on a rationally derived assessment of costs and benefits (Lewicki & Bunker, 1996). Williamson (1993) argued that trust is primarily analytic in business and commercial relations and that it represents one aspect of rational risk analysis. Trust accumulates and dissipates based on the effect of cumulative interactions with the other agent. These interactions lead to an accumulation of knowledge that establishes trust through an expected value calculation in which the probabilities of various outcomes are estimated with increasing experience (Holmes, 1991). Trust can also depend on an analysis of how organizational and individual constraints affect performance. For example, trust can be considered in terms of a rational argument in which grounds for various conclusions are developed and defended according to the evidence (Cohen, 2000).

These approaches provide a useful evaluation of the normative level of trust, but they may not describe how trust actually develops. Analytic approaches to trust probably overstate cognitive capacities and understate the influence of affect and the role of strategies to accommodate the limits of bounded rationality, similar to normative decision-making theories (Janis & Mann, 1977; Simon, 1957). Descriptive accounts of decision making show that expert decision makers seldom engage in conscious calculations or in an exhaustive comparison of alternatives (Crossman, 1974; Klein, 1989). The analytic process is similar to knowledge-based performance as described by Rasmussen (1983), in which information is processed and plans are formulated and evaluated using a function-based mental model of the system. Knowledge-based or analytic processes are probably complemented by less cognitively demanding processes.

A less cognitively demanding process is for trust to develop according to analogical judgments that link levels of trust to characteristics of the agent and environmental context. These categories develop through direct observation of the trusted agent, intermediaries that convey their observations, and presumptions based on standards, category membership, and procedures (Kramer, 1999).

Explicit and implicit rules frequently guide individual and collective collaborative behavior (Mishra, 1996). Sitkin and Roth (1993) illustrated the importance of rules in their distinction between trust as an institutional arrangement and interpersonal trust. Institutional trust depends on contracts, legal procedures, and formal rules, whereas interpersonal trust is based on shared values. The application of rules in institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Figure 3. The interplay among the analytic, analogical, and affective processes underlying trust, with the analytic and analogical processes both contributing to the affective process and the affective process also having a dominant influence on the analytic and analogical processes.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings or procedures guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oately, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oately). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation and that these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions. It also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner; however, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them. People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or from the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop in a way different from that in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct to understand reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of E-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).

A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function: the largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
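
To make the form of this account concrete, the following sketch simulates trust as a first-order lag that relaxes toward the current capability of the automation. It is a minimal illustration of the kind of equation described above, not the fitted model from the cited studies; the time constant, fault profile, and initial value are all hypothetical.

    import numpy as np

    # First-order model of trust dynamics: trust relaxes toward the current
    # capability of the automation with time constant tau (hypothetical values).
    tau = 20.0     # time constant of trust, in trials; sets temporal specificity
    trials = 200

    capability = np.ones(trials)   # automation performs well...
    capability[80:100] = 0.2       # ...except during a transient (acute) fault

    trust = np.empty(trials)
    trust[0] = 0.9                 # initial level of trust
    for t in range(1, trials):
        # dT/dt = (capability - T) / tau: the largest change occurs
        # immediately after the fault, with a residual effect over time.
        trust[t] = trust[t - 1] + (capability[t] - trust[t - 1]) / tau

The exponential decline during the fault and the gradual recovery afterward, both paced by the time constant, correspond to the inertia of trust described above.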

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
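
As one illustration of what hysteresis in reliance might look like, a two-threshold rule makes the reliance decision depend on the direction of the change in trust. This is a hypothetical sketch for exposition, not a model drawn from the studies reviewed here; the threshold values are arbitrary.

    def rely_on_automation(trust, currently_relying,
                           engage_above=0.7, disengage_below=0.4):
        """Two-threshold (hysteresis) rule for the reliance decision.

        An operator engages the automation only when trust is high but keeps
        relying on it until trust falls well below that point, so the response
        to a change in trust depends on the direction of the change.
        """
        if currently_relying:
            return trust >= disengage_below  # keep relying until trust falls far
        return trust >= engage_above         # engage only once trust is high

Sweeping trust up and then down through the same values yields different reliance decisions, which is the delayed, direction-dependent response that the dynamical perspective calls hysteresis.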

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that automation should have the final authority for decisions and actions that must be allocated to automation in time-critical situations (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors caused trust and reliance to decline more for those in a familiar city, whose self-confidence was high, than for those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).
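
A toy expected-value comparison illustrates this last point. All of the numbers below are hypothetical; the point is only that when automation is reliable and monitoring displaces a valuable concurrent task, delegating without monitoring can carry the higher expected payoff even though some failures go undetected.

    # Toy expected-value comparison (all numbers hypothetical).
    p_failure = 0.02             # probability the automation fails on a trial
    loss_missed_failure = 50     # cost of an undetected automation failure
    concurrent_task_value = 2.0  # payoff from the concurrent task per trial
    monitoring_cost = 1.5        # concurrent-task payoff forgone while monitoring

    ev_monitor = concurrent_task_value - monitoring_cost  # failures all caught
    ev_delegate = concurrent_task_value - p_failure * loss_missed_failure

    print(f"monitor: {ev_monitor:.2f}, delegate: {ev_delegate:.2f}")
    # monitor: 0.50, delegate: 1.00 -> not monitoring is "optimal" here,
    # even though it permits occasional undetected failures.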

One approach to reduce overtrust in the automation and increase detection of failures is through adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation and environmental situations that cause automation to perform poorly are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in the trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the performance dimension was also the most influenced by the presence or absence of a fault, as compared with the ratings for the process or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.
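This feedback loop lends itself to a minimal simulation. The sketch below is a deliberately simplified illustration, not the time-series models that Lee and Moray (1992) actually fit: trust is nudged toward each experienced outcome, but only on trials where trust is high enough for the automation to be used at all. The gain, reliance threshold, and reliability schedules are hypothetical assumptions.

```python
import random

# Minimal sketch of the trust-reliance feedback loop described above.
# Trust moves toward experienced reliability, but only on trials where
# the operator actually relies on the automation; below the reliance
# threshold the automation goes unused, no new evidence arrives, and
# trust stalls wherever it last stood.

def simulate(reliability_schedule, trust=0.5, gain=0.1, threshold=0.45):
    for reliability in reliability_schedule:
        if trust >= threshold:                    # rely on automation
            outcome = 1.0 if random.random() < reliability else 0.0
            trust += gain * (outcome - trust)     # update from experience
        # else: manual control, trust unchanged
    return trust

random.seed(1)
early_low  = [0.60] * 50 + [0.95] * 150   # starts unreliable, then improves
early_high = [0.95] * 150 + [0.60] * 50   # starts reliable, then degrades
print(f"final trust, low-then-high reliability: {simulate(early_low):.2f}")
print(f"final trust, high-then-low reliability: {simulate(early_high):.2f}")
```

Because trust stops updating once reliance stops, a run of early failures can lock trust below the reliance threshold even after reliability improves, reproducing the path dependence and asymmetric resilience described above.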

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al., 2000; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of the automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an on-line survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.
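The finding that observers weighted reliable sources more heavily echoes a normative principle from signal detection and estimation: independent estimates of the same quantity should be combined with weights proportional to their precision. The sketch below shows that textbook inverse-variance weighting rule with hypothetical numbers; it illustrates the principle, not the specific task or analysis of Montgomery and Sorkin (1996).

```python
# Textbook inverse-variance weighting: when fusing independent noisy
# estimates of the same quantity, the statistically optimal combination
# weights each source by 1/variance, so reliable sources count for more.
# The element readings and noise levels below are hypothetical.

def fuse(estimates, variances):
    weights = [1.0 / v for v in variances]
    return sum(w * x for w, x in zip(weights, estimates)) / sum(weights)

# Two display elements reporting the same signal: one reliable, one noisy.
estimates = [0.9, 0.2]   # element readings (hypothetical)
variances = [0.1, 1.0]   # low-noise element vs. high-noise element
print(f"fused estimate: {fuse(estimates, variances):.2f}")
# The fused value (~0.84) sits close to the reliable element's reading.
```

Luminance coding can be seen as making these weights perceptually available, so that the display itself suggests how much each element deserves to count.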

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding that specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent (a minimal scoring sketch follows this list).
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
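As one rough way such an evaluation might be operationalized, the sketch below scores paired observations of subjective trust and actual automation capability: calibration as the mean absolute gap between the two, and resolution as how well trust separates high- from low-capability situations. These metric definitions are illustrative assumptions in the spirit of the probability-calibration literature (Lichtenstein, Fischhoff, & Phillips, 1982), not an established scoring procedure for trust.

```python
# Illustrative scoring of trust appropriateness from paired observations
# of subjective trust (0-1) and observed automation capability (0-1).
# Both the data and the metric definitions are assumptions for the
# sake of the example.

from statistics import mean

pairs = [  # (trust rating, observed capability) -- hypothetical data
    (0.9, 0.95), (0.8, 0.90), (0.7, 0.60), (0.4, 0.35), (0.3, 0.40),
]

# Calibration: average absolute gap between trust and capability.
calibration_error = mean(abs(t - c) for t, c in pairs)

# Resolution: does trust separate high- from low-capability situations?
high = [t for t, c in pairs if c >= 0.5]
low = [t for t, c in pairs if c < 0.5]
resolution = mean(high) - mean(low)

print(f"mean miscalibration: {calibration_error:.2f} (lower is better)")
print(f"resolution:          {resolution:.2f} (higher is better)")
```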

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance, it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.
Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.
Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.
Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.
Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.
Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.
Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.
Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.
Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.
Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.
Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.
Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.
Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.
Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.
Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.
Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.
Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.
Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.
Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.
Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.
Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.
Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.
Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.
Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.
Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf
Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.
Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.
Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).
Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.
Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.
Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.
Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.
Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.
Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.
Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.
Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social-stimuli. Behavioural Brain Research, 41, 81–94.
Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.
Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.
Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.
Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.
Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.
Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.
Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.
Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.
Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.
Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.
Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.
Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.
Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.
Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.
Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.
Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.
Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.
Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.
Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.
Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.
Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.
Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.
Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.
Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.
Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.
Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.
Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.
Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.
Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.
Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.
Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.
Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.
Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.
Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.
Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.
Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.
Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.
Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.
Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.
Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.
Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.
Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.
Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.
Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.
Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.
Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.
Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.
Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.
Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.
Lee, J. D., & Moray, N. (1992). Trust, control strategies, and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.
Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.
Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.
Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.
Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.
Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.
Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.
Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.
Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.
Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.
Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.
Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.
Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.
Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.
McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.
Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.
Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.
Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.
Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.
Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.
Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.
Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.
Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.
Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.
Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.
Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.
Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.
Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.
Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.
Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.
Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.
Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.
Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.
Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.
Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.
Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.
Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.
National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.
Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.
Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.
Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.
Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.
Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.
Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.
Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.
Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.
Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.
Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.
Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.
Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.
Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.
Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.
Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.
Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.
Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.
Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.
Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.
Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.
Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.
Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.
Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.
Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.
Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.
Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.
Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.
Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.
Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.
Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.
Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.
Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.
Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.
Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.
Simon, H. (1957). Models of man. New York: Wiley.
Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.
Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.
Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.
Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.
Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.
Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.
Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.
Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.
Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.
Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.
Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.
Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perservative reaching. Behavior and Brain Sciences, 24, 1–86.
Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.
Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.
van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.
Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.
Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.
Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.
Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.
Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.
Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.
Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.
Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.
Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.
Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.
Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.
Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003



institution-based trust reflects guarantees and safety nets based on the organizational constraints of regulations and legal recourses that constrain behavior and make it more possible to trust (McKnight et al., 1998). However, when fundamental values, which frequently support interpersonal trust, are violated, legalistic approaches and applications of rules to restore trust are ineffective and can further undermine trust (Sitkin & Roth, 1993). When rules are consistent with trustworthy behavior, they can increase people's expectation of satisfactory performance during times of normal operation by pairing situations with rule-based expectations, but rules can have a negative effect when situations diverge from normal operation.

Analogical trust can also depend on intermediaries who can convey information to support judgments of trust (Burt & Knez, 1996), such as reputations and gossip that enable individuals to develop trust without any direct contact. In the context of trust in automation, response times to warnings tend to increase when false alarms occur. This effect was counteracted by gossip that suggested that the rate of false alarms was lower than it actually was (Bliss, Dunn, & Fuller, 1995). Similarly, trust can be based on presumptions of the trustworthiness of a category or role of the person rather than on the actual performance (Kramer, Brewer, & Hanna, 1996). If trust is primarily based on rules that characterize performance during normal situations, then abnormal situations or erroneous category membership might lead to the collapse of trust because the assurances associated with normal or customary operation are no longer valid. Because of this, category-based trust can be fragile, particularly when abnormal situations disrupt otherwise stable roles. For example, when roles were defined by a binding contract, trust declined substantially when the contracts were removed (Malhotra & Murnighan, 2002).

Similar effects may occur with trust in automation. Specifically, Miller (2002) suggested that computer etiquette may have an important influence on human-computer interaction. Etiquette may influence trust because category membership associated with adherence to a particular etiquette helps people to infer how the automation will perform. Intentionally developing computer etiquette could promote appropriate trust, but it could lead to inappropriate trust if people infer inappropriate category memberships and develop distorted expectations regarding the capability of the automation.

These studies suggest that trust in automation may depend on category judgments based on a variety of sources that include direct and indirect interaction with the automation. Because these analogical judgments emerge from complex interactions between people, procedures, and prolonged experience, data from laboratory experiments may need to be carefully assessed to determine how well they generalize to actual operational settings. In this way, the analogical process governing trust corresponds very closely to Rasmussen's (1983) concept of rule-based behavior, in which condition-action pairings, or procedures, guide behavior.

Analytic and analogical processes do not account for the affective aspects of trust, which represent the core influence of trust on behavior (Kramer, 1999). Emotional responses are critical because people not only think about trust, they also feel it (Fine & Holyfield, 1996). Emotions fluctuate over time according to the performance of the trustee, and they can signal instances where expectations do not conform to the ongoing experience. As trust is betrayed, emotions provide signals concerning the changing nature of the situation (Jones & George, 1998).

Berscheid (1994) used the concept of automatic processing to describe how behaviors that are relevant to frequently accessed trait constructs, such as trust, are likely to be perceived under attentional overload situations. Because of this, people process observations of new behavior according to old constructs without always being aware of doing so. Automatic processing plays a substantial role in attributional activities, with many aspects of causal reasoning occurring outside conscious awareness. In this way, emotions bridge the gaps in rationality and enable people to focus their limited attentional capacity (Johnson-Laird & Oately, 1992; Loewenstein et al., 2001; Simon, 1967). Because the cognitive complexity of relationships can exceed a person's capacity to form a complete mental model, people cannot perfectly predict behavior, and emotions serve to redistribute cognitive resources and manage priorities (Johnson-Laird & Oately). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation and that these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions. It also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner; however, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).
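The structure of such a game is simple to state. The following is a minimal sketch of an iterated prisoners' dilemma of the kind used to elicit cooperation; the payoffs and strategies are illustrative assumptions, not the protocol of Rilling et al. (2002).

import random

# (my_payoff, partner_payoff) indexed by (my_move, partner_move); C = cooperate, D = defect.
PAYOFFS = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
           ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(history):
    # Cooperate on the first round, then mirror the partner's previous move.
    return "C" if not history else history[-1][1]

def noisy_cooperator(history, p_defect=0.2):
    # Mostly cooperates but occasionally defects.
    return "D" if random.random() < p_defect else "C"

history = []
score_a = score_b = 0
for _ in range(50):
    a = tit_for_tat(history)
    b = noisy_cooperator([(y, x) for x, y in history])  # partner sees the mirrored history
    pay_a, pay_b = PAYOFFS[(a, b)]
    score_a += pay_a
    score_b += pay_b
    history.append((a, b))

mutual = sum(1 for a, b in history if a == b == "C")
print(f"mutual cooperation in {mutual} of 50 rounds; scores {score_a} vs {score_b}")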

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional, nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them. People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or from the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop in a way different from that in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct to understand reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.
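To make this pathway concrete, the following minimal sketch assumes an illustrative combination rule for forming the intention to rely and a simple time-to-engage constraint; the attitude names, weights, and thresholds are our own assumptions for illustration, not parameters of the model in Figure 4.

from dataclasses import dataclass

@dataclass
class Attitudes:
    trust: float            # 0..1, trust in the automation
    self_confidence: float  # 0..1, confidence in manual control
    workload: float         # 0..1, subjective workload
    perceived_risk: float   # 0..1

def intention_to_rely(a: Attitudes) -> bool:
    # Illustrative rule: rely when trust exceeds self-confidence, with
    # workload and perceived risk nudging the operator toward or away
    # from the automation.
    return a.trust + 0.3 * a.workload > a.self_confidence + 0.2 * a.perceived_risk

def actual_reliance(intend: bool, time_available: float, setup_time: float) -> bool:
    # Intention becomes reliance only if there is time to configure
    # and engage the automation.
    return intend and time_available >= setup_time

a = Attitudes(trust=0.7, self_confidence=0.6, workload=0.5, perceived_risk=0.4)
print(actual_reliance(intention_to_rely(a), time_available=2.0, setup_time=5.0))
# Prints False: the operator intends to rely but cannot engage in time.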

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust, and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.
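The distinction can be summarized as a simple decision rule. The sketch below is illustrative only; the function and its arguments are our own, not from Meyer (2001).

def operator_should_act(signal_present: bool, style: str) -> bool:
    if style == "reliance":      # e.g., a "system OK" indicator
        return not signal_present    # act when the normal indication disappears
    if style == "compliance":    # e.g., a hazard warning
        return signal_present        # act when the warning fires
    raise ValueError(style)

print(operator_should_act(signal_present=False, style="reliance"))   # True: "OK" light gone
print(operator_should_act(signal_present=True, style="compliance"))  # True: warning is on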

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of e-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
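This divergence can be illustrated with a minimal simulation, assuming a simple threshold rule for reliance and a proportional trust update; the capability, thresholds, and gains are illustrative values, not estimates from the studies cited.

def simulate(trust, capability=0.9, rely_threshold=0.5,
             gain=0.2, drift=0.02, steps=20):
    trajectory = [trust]
    for _ in range(steps):
        if trust >= rely_threshold:
            # Relying: the operator observes the automation, and trust
            # moves toward its actual capability.
            trust += gain * (capability - trust)
        else:
            # Not relying: little new information arrives, and trust drifts down.
            trust = max(0.0, trust - drift)
        trajectory.append(round(trust, 3))
    return trajectory

# Two operators whose initial trust differs by only 0.02 end up on opposite
# sides of the reliance threshold and diverge toward automatic vs. manual control.
print(simulate(0.51))  # climbs toward 0.9
print(simulate(0.49))  # decays toward 0.0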

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function. The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly the trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
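The following minimal sketch shows how a first-order model of this kind behaves, assuming a discrete-time update; the time constant and fault profile are illustrative, not the values estimated by Lee and Moray (1994).

# First-order (exponential) trust response to a fault, using the discrete
# update T[k+1] = T[k] + (dt/tau) * (capability[k] - T[k]).
def first_order_trust(capability, tau=5.0, dt=1.0, t0=0.9):
    trust = t0
    out = []
    for c in capability:
        trust += (dt / tau) * (c - trust)  # largest change right after capability shifts
        out.append(round(trust, 3))
    return out

# An acute fault: capability drops for 5 steps, then is repaired.
profile = [0.9] * 5 + [0.3] * 5 + [0.9] * 20
print(first_order_trust(profile))
# Trust decays exponentially toward 0.3 during the fault and recovers
# exponentially (with time constant tau) after the repair.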

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
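Hysteresis can be illustrated with a minimal sketch in which the reliability at which an operator starts relying differs from the reliability at which he or she stops; the thresholds are illustrative assumptions, not values from the literature.

def reliance_path(reliabilities, engage_at=0.8, disengage_at=0.6):
    relying = False
    path = []
    for r in reliabilities:
        if not relying and r >= engage_at:
            relying = True       # trust builds slowly: engage only at high reliability
        elif relying and r < disengage_at:
            relying = False      # trust erodes slowly: disengage only at low reliability
        path.append((round(r, 2), relying))
    return path

up = [i / 10 for i in range(11)]     # reliability sweeps 0.0 -> 1.0
sweep = up + list(reversed(up))      # ...and then back down to 0.0
for r, relying in reliance_path(sweep):
    print(r, "relying" if relying else "manual")
# On the way up, reliance begins at r = 0.8; on the way down, it persists
# until r falls below 0.6 - a delayed response that depends on direction.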

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that the final authority for decisions and actions in time-critical situations should be allocated to automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: Low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors cause trust and reliance to decline more for those in a familiar city, whose self-confidence was high, as compared with those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).
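The cost-balancing idea behind such a monitoring pattern can be sketched with a simple expected-cost calculation; the failure probability and costs below, including the assumption that concurrent-task costs grow superlinearly with monitoring rate, are illustrative and not drawn from Moray (2003).

def expected_cost(p_check, p_fail=0.05, cost_miss=50.0, cost_neglect=5.0):
    miss = (1 - p_check) * p_fail * cost_miss   # failures missed while not checking
    neglect = cost_neglect * p_check ** 2       # concurrent tasks suffer as checking increases
    return miss + neglect

# Scan candidate monitoring rates for the cheapest policy; with these costs,
# an intermediate rate (p_check = 0.25), not constant monitoring, is optimal.
rates = [i / 20 for i in range(21)]
best = min(rates, key=expected_cost)
print(f"lowest expected cost at p_check = {best:.2f}")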

One approach to reduce overtrust in the automation and increase detection of failures is through adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and product dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the process dimension was most influenced by the presence or absence of a fault, as compared with the performance or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: Negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al.; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of the automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an on-line survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.
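The weighting of sources by reliability can be sketched in a signal detection framing, where each source's vote is weighted by its log odds of being correct; the votes, reliabilities, and threshold below are illustrative assumptions, not the Montgomery and Sorkin (1996) paradigm.

import math

def weighted_decision(readings, reliabilities, threshold=0.0):
    # Weight each source's vote by its log odds of being correct, so a
    # source at chance (reliability 0.5) contributes nothing.
    score = sum(vote * math.log(p / (1 - p))
                for vote, p in zip(readings, reliabilities))
    return "signal" if score > threshold else "noise"

readings = [+1, -1, -1]            # votes: signal (+1) or noise (-1)
reliabilities = [0.9, 0.6, 0.55]   # probability each source is correct
print(weighted_decision(readings, reliabilities))
# "signal": the single highly reliable source outweighs two weak dissenters.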

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Curry, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance, and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance), the dimension of detail, which describes the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organization and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations, because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara A Damasio A R Damasio H amp Anderson S W(1994) Insensitivity to future consequences following damageto human prefrontal cortex Cognition 50 7ndash15

Bechara A Damasio H Tranel D amp Anderson S W (1998)Dissociation of working memory from decision making withinthe human prefrontal cortex Journal of Neuroscience 18428ndash437


Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect: An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies, and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering, with a focus on trust in automation, from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003



priorities (Johnson-Laird & Oately). Emotions can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice.

Recent neurological evidence suggests that affect may play an important role in decision making, even when cognitive resources are available to support a calculated choice. This research is important because it suggests a neurologically based description for how trust might influence reliance. Damasio, Tranel, and Damasio (1990) showed that although people with brain lesions in the ventromedial sector of the prefrontal cortices retain reasoning and other cognitive abilities, their emotions and decision-making ability are critically impaired. A series of studies have demonstrated that the decision-making deficit stems from a lack of affect and not from deficits of working memory, declarative knowledge, or reasoning, as might be expected (Bechara, Damasio, Tranel, & Anderson, 1998; Bechara, Damasio, Tranel, & Damasio, 1997). The somatic marker hypothesis explains this effect by suggesting that marker signals from the physiological response to the emotional aspects of decision situations influence the processing of information and the subsequent response to similar decision situations.

Emotions help to guide people away from situations in which negative outcomes are likely (Damasio, 1996). In a simple gambling decision-making task, patients with prefrontal lesions performed much worse than did a control group of healthy participants. The patients responded to immediate prospects and failed to accommodate long-term consequences (Bechara, Damasio, Damasio, & Anderson, 1994). In a subsequent study, healthy participants showed a substantial response to a large loss, as measured by skin conductance response (SCR), a physiological measure of emotion, whereas patients with prefrontal lesions did not (Bechara et al., 1997). Interestingly, the healthy participants also demonstrated an anticipatory SCR and began to avoid risky choices before they explicitly recognized the alternative as being risky. These results parallel the work of others and strongly support the argument that emotions play an important role in decision making (Loewenstein et al., 2001; Schwarz, 2000; Sloman, 1996) and that emotional reactions may be a critical element of trust and the decision to rely on automation.

Emotion influences not only individual decision making but also cooperation and social interaction in groups (Damasio, 2003). Specifically, Rilling et al. (2002) used an iterated prisoners' dilemma game to examine the neurological basis for social cooperation and found systematic patterns of activation in the anteroventral striatum and orbital frontal cortex after cooperative outcomes. They also found substantial individual differences regarding the level of activation, and that these differences are strongly associated with the propensity to engage in cooperative behavior. These patterns may reflect emotions of trust and goodwill associated with successful cooperation (Rilling et al.). Interestingly, orbital frontal cortex activation is not limited to human-human interactions; it also occurs with human-computer interactions when the computer responds to the human's behavior in a cooperative manner. However, the anteroventral striatum activation is absent during the computer interactions (Rilling et al.).

Emotion also plays an important role by supporting implicit communication between several people (Pally, 1998). Each person's tone, rhythm, and quality of speech convey information about his or her emotional state and so regulate the physiological emotional responses of everyone involved. People spontaneously match nonverbal cues to generate emotional attunement. Emotional, nonverbal exchanges are a critical complement to the analytic information in a verbal exchange (Pally). These results seem consistent with recent findings regarding interpersonal trust in which anonymous computerized messages were less effective than face-to-face communication because individuals judge trustworthiness from facial expressions and from hearing the way others talk (Ostrom, 1998). Less subtle cues, such as violation of etiquette rules, which include the Gricean maxims of communication (Grice, 1975), may also lead to negative affect and undermine trust.

In the context of process control, Zuboff (1988) quoted an operator's description of the discomfort that occurs when diverse cues are eliminated through computerization: "I used to listen to sounds the boiler makes and know just how it was running. I could look at the fire in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation, in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them: People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or from the intended use of the system (purpose). Early in the relationship with automation, there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop in a way different from that in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct for understanding reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.
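
To make this structure concrete, the sketch below shows one way the attitudes named above might combine into an intention to rely, with contextual constraints gating whether reliance actually occurs. It is only an illustrative reading of the framework: the weights, signs, threshold, and time parameters are hypothetical, not values from the model or the literature.

```python
# A minimal sketch of the framework: attitudes combine into a graded intention
# to rely, and contextual constraints gate whether reliance actually occurs.
# All weights, signs, and thresholds are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Attitudes:
    trust: float            # evaluation of the automation (0..1)
    self_confidence: float  # evaluation of one's own manual control (0..1)
    workload: float         # subjective workload (0..1)
    perceived_risk: float   # perceived cost of a failure (0..1)

def intention_to_rely(a: Attitudes) -> float:
    # Trust raises the intention to rely and self-confidence lowers it; high
    # workload is assumed to favor delegation, and perceived risk could enter
    # with either sign (a negative weight is assumed here).
    return 0.5 * a.trust - 0.3 * a.self_confidence + 0.2 * a.workload - 0.1 * a.perceived_risk

def relies(a: Attitudes, time_to_engage: float, time_available: float) -> bool:
    # Even a strong intention may not become reliance if the context (e.g.,
    # the time needed to configure and engage the automation) does not permit.
    return intention_to_rely(a) > 0.1 and time_to_engage <= time_available
```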

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).
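
This taxonomy and its observability implications can be summarized in a small mapping. The entries for the two extreme types restate the examples above; the entries for the two intermediate types are assumptions, since the text gives explicit examples only for information acquisition and action implementation.

```python
# Whether the operator can observe each automation type while NOT relying on
# it (types from Parasuraman, Sheridan, & Wickens, 2000). The judgments for
# the two middle types are assumptions, not claims from the text.
OBSERVABLE_WHEN_NOT_RELIED_ON = {
    "information acquisition": True,   # e.g., target cueing can still be watched
    "information analysis": True,      # assumed: analysis displays remain visible
    "decision selection": False,       # assumed: unconsulted advice goes unseen
    "action implementation": False,    # e.g., a pump controller's actions are hidden
}
```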

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.
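
The distinction can be stated compactly: the operator acts on the absence of a normal indication in the reliance-oriented case, and on the presence of a warning in the compliance-oriented case. The sketch below is only a restatement of that contrast, with hypothetical signal names.

```python
# Meyer's (2001) reliance/compliance distinction restated as two trigger
# rules. Signal names are hypothetical.

def should_act_reliance_oriented(normal_indication_present: bool) -> bool:
    # Implicit indicator: the need to act is signaled by the automation
    # failing to indicate that the system is operating normally.
    return not normal_indication_present

def should_act_compliance_oriented(warning_present: bool) -> bool:
    # Explicit indicator: the need to act is signaled by a warning.
    return warning_present
```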

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of E-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
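
A toy simulation can show why such small initial differences matter when reliance gates the opportunity to observe the automation. The update rule, gain, and threshold below are hypothetical; the point is only the qualitative bifurcation between sustained reliance and sustained manual control.

```python
# Toy closed-loop dynamic: trust is updated toward the automation's true
# capability only while the operator relies on (and so observes) it.
# Gain, threshold, and capability values are hypothetical.

def simulate(initial_trust: float, capability: float = 0.8,
             threshold: float = 0.5, gain: float = 0.2, steps: int = 50) -> float:
    trust = initial_trust
    for _ in range(steps):
        if trust > threshold:                     # intention reduced to a threshold rule
            trust += gain * (capability - trust)  # observation recalibrates trust
        # otherwise: manual control, little new information, trust stays frozen
    return trust

print(round(simulate(0.51), 2))  # ~0.8: reliance lets trust grow toward capability
print(round(simulate(0.49), 2))  # 0.49: manual control, trust never recovers
```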

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function. The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly the trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
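
A minimal numerical rendering of this first-order dynamic is given below; the time constant and the fault timing are hypothetical values chosen for illustration. After a step change in capability C, trust follows the closed-form exponential T(t) = C + (T0 - C) * exp(-t/tau), so the largest change occurs immediately and the residual effect decays over time.

```python
# First-order model of trust: dT/dt = (capability - T) / tau, integrated with
# Euler steps. tau and the fault timing are hypothetical illustrations.
tau, dt = 10.0, 1.0   # time constant and step size (arbitrary units)
trust = 0.9
for t in range(40):
    capability = 0.3 if 5 <= t < 15 else 0.9   # a transient (acute) fault
    trust += dt * (capability - trust) / tau
    # During the fault, trust decays exponentially toward 0.3; after repair,
    # it recovers exponentially toward 0.9, with inertia set by tau.
    print(t, round(trust, 3))
```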

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
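A minimal simulation can show how direction-dependent dynamics produce hysteresis. The update rule, the asymmetric rates, and all parameter values below are illustrative assumptions, not a model from the trust literature:

```python
# Minimal sketch: direction-dependent trust dynamics (hysteresis).
# Assumptions: discrete-time first-order update; trust erodes faster
# than it recovers (loss_rate > gain_rate). Parameter values are
# illustrative only.

def update_trust(trust, capability, gain_rate=0.05, loss_rate=0.20):
    """Move trust toward current capability, faster on the way down."""
    rate = loss_rate if capability < trust else gain_rate
    return trust + rate * (capability - trust)

# Capability drops after a fault, then returns to normal after repair.
capability_profile = [0.9] * 20 + [0.4] * 40 + [0.9] * 60

trust = 0.9
for t, capability in enumerate(capability_profile):
    trust = update_trust(trust, capability)
    if t % 20 == 0:
        print(f"t={t:3d}  capability={capability:.2f}  trust={trust:.2f}")
# Trust falls quickly when capability drops but returns along a slower
# path after repair -- a hysteresis signature that depends on the
# direction of the change.
```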

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that the final authority for decisions and actions in time-critical situations should be allocated to the automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors cause trust and reliance to decline more for those in a familiar city, whose self-confidence was high, as compared with those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation's performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation, combined with the responsibility for multiple tasks in addition to monitoring the automation, can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).
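The cost-balancing idea behind eutactic monitoring can be sketched with a hypothetical expected-cost model; this is not Moray's (2003) analysis, and the sampling-interval formulation, cost rates, and failure rate below are all assumptions chosen for illustration:

```python
# Hypothetical sketch: monitoring as a cost balance. Sampling the
# automation every T seconds costs attention at rate c_m / T, while
# failures (rate lam per second) go undetected for T / 2 seconds on
# average, costing c_f per second of undetected failure.
import math

def cost_rate(T, c_m=2.0, c_f=10.0, lam=0.01):
    """Total expected cost per second for a sampling interval T."""
    return c_m / T + c_f * lam * T / 2.0

# The minimum lies between the extremes: neither constant monitoring
# (tiny T) nor complete neglect (huge T) is optimal.
T_opt = math.sqrt(2 * 2.0 / (10.0 * 0.01))  # closed-form minimizer
print(f"optimal interval = {T_opt:.2f} s, cost rate = {cost_rate(T_opt):.3f}")
```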

One approach to reducing overtrust in the automation and increasing the detection of failures is adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in the trust ratings associated with the performance dimension than in the ratings associated with process or purpose. Likewise, the rating that corresponded to the performance dimension was most influenced by the presence or absence of a fault, as compared with the ratings for the process or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.
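The self-reinforcing character of distrust can be illustrated with a small simulation of the reliance feedback loop. This is a sketch under assumed parameters (the reliance threshold, learning rate, and reliability value are hypothetical), not a fitted model:

```python
# Sketch of the reliance feedback loop described above: the operator
# samples the automation only when trust exceeds a reliance threshold,
# so low initial trust starves itself of the evidence needed to recover.
# All parameters are illustrative assumptions.
import random

def run(initial_trust, true_reliability=0.9, threshold=0.5,
        learning_rate=0.1, steps=200, seed=1):
    random.seed(seed)
    trust = initial_trust
    for _ in range(steps):
        if trust >= threshold:  # operator engages the automation
            outcome = 1.0 if random.random() < true_reliability else 0.0
            trust += learning_rate * (outcome - trust)
        # Below threshold: manual control, no new evidence, trust is stuck.
    return trust

print(run(initial_trust=0.6))  # converges near the true reliability (~0.9)
print(run(initial_trust=0.4))  # never engages; trust stays at 0.4
```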

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al., 2000; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, the appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display, or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an online survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that, for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced the trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more heavily than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk, as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences automation capability. The concept of ecological interface design has led to a substantial research effort focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research on human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations, because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure in the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta-trust in the other people. Meta-trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence the design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.


Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House N A Butler M H amp Schiff L R (1998) Cooperatingknowledge work and trust Sharing environmental planningdata sets In Proceedings of the ACM Conference on ComputerSupported Cooperative Work (pp 335ndash343) New York Asso-ciation for Computing Machinery

Vicente K J (1999) Cognitive work analysis Towards safe pro-ductive and healthy computer-based work Mahwah NJErlbaum

Vicente K J (2002) Ecological interface design Progress andchallenges Human Factors 44 62ndash78

Vicente K J amp Rasmussen J (1992) Ecological interface designTheoretical foundations IEEE Transactions on Systems Manand Cybernetics SMC-22 589ndash606

Wickens C D Gempler K amp Morphew M E (2000) Workloadand reliability of predictor displays in aircraft traffic avoidanceTransportation Human Factors 2 99ndash126

Wicks A C Berman S L amp Jones T M (1999) The structureof optimal trust Moral and strategic Academy of ManagementReview 24 99ndash116

Williamson O (1993) Calculativeness trust and economic orga-nization Journal of Law and Economics 36 453ndash486

Wright G Rowe G Bolger F amp Gammack J (1994) Co-herence calibration and expertise in judgmental probabilityforecasting Organizational Behavior and Human DecisionProcesses 57 1ndash25

Yamagishi T amp Yamagishi M (1994) Trust and commitment inthe United States and Japan Motivation and Emotion 18129ndash166

Yeh M amp Wickens C D (2001) Display signaling in augmentedreality Effects of cue reliability and image realism on attentionallocation and trust calibration Human Factors 43 355ndash365

Zheng J Bos N Olson J S amp Olson G M (2001) Trust with-out touch Jump-start trust with social chat In CHI 2001Conference on Human Factors in Computing SystemsExtended Abstracts (pp 293ndash294) New York Association forComputing Machinery

Zuboff S (1988) In the age of smart machines The future ofwork technology and power New York Basic Books

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his PhD in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an MS in industrial engineering, with a focus on trust in automation, from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003


…in the furnace and tell by its color how it was burning. I knew what kinds of adjustments were needed by the shades of the color I saw. A lot of the men also said that there were smells that told you different things about how it was running. I feel uncomfortable being away from these sights and smells" (p. 63).

These results show that trust may depend on a wide range of cues and that the affective basis of trust can be quite sensitive to subtle elements of human-human and human-automation interactions. Promoting appropriate trust in automation may depend on presenting information about the automation in a manner compatible with the analytic, analogical, and affective processes that influence trust.

GENERALIZING TRUST IN PEOPLE TO TRUST IN AUTOMATION

Trust has emerged as an important concept in describing relationships between people, and this research may provide a good foundation for describing relationships between people and automation. However, relationships between people are qualitatively different from relationships between people and automation, so we now examine empirical and theoretical considerations in generalizing trust in people to trust in automation.

Research has addressed trust in automation in a wide range of domains. For example, trust was an important explanatory variable in understanding drivers' reactions to imperfect traffic congestion information provided by an automobile navigation system (Fox & Boehm-Davis, 1998; Kantowitz, Hanowski, & Kantowitz, 1997a). Also in the driving domain, low trust in automated hazard signs combined with low self-confidence created a double-bind situation that increased driver workload (Lee, Gore, & Campbell, 1999). Trust has also helped explain reliance on augmented vision systems for target identification (Conejo & Wickens, 1997; Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001) and pilots' perception of cockpit automation (Tenney, Rogers, & Pew, 1998). Trust and self-confidence have also emerged as the critical factors in investigations into human-automation mismatches in the context of machining (Case, Sinclair, & Rani, 1999) and in the control of a teleoperated robot (Dassonville, Jolly, & Desodt, 1996). Similarly, trust seems to play an important role in discrete manufacturing systems (Trentesaux, Moray, & Tahon, 1998) and continuous process control (Lee & Moray, 1994; Muir, 1989; Muir & Moray, 1996).

Recently, trust has also emerged as a useful concept to describe interaction with Internet-based applications, such as Internet shopping (Corritore, Kracher, & Wiedenbeck, 2001b; Lee & Turban, 2001) and Internet banking (Kim & Moon, 1998). Some argue that trust will become increasingly important as the metaphor for computers moves from inanimate tools to animate software agents (Castelfranchi, 1998; Castelfranchi & Falcone, 2000; Lewis, 1998; Milewski & Lewis, 1997) and as computer-supported cooperative work becomes more commonplace (Van House, Butler, & Schiff, 1998). In combination, these results show that trust may influence misuse and disuse of automation and computer technology in many domains.

Research has demonstrated the importance of trust in automation using a wide range of experimental and observational approaches. Several studies have examined trust and affect using synthetic laboratory tasks that have no direct connection to real systems (Bechara et al., 1997; Riley, 1996), but the majority have used microworlds (Inagaki, Takae, & Moray, 1999; Lee & Moray, 1994; Moray & Inagaki, 1999). A microworld is a simplified version of a real system in which the essential elements are retained and the complexities eliminated to make experimental control possible (Brehmer & Dorner, 1993). Part-task and fully immersive driving simulators are examples of microworlds and have been used to investigate the role of trust in route-planning systems (Kantowitz et al., 1997a) and road condition warning systems (Lee et al., 1999). Field studies have also revealed the importance of trust, as demonstrated by observations of the use of autopilots and flight management systems (Mosier, Skitka, & Korte, 1994), maritime navigation systems (Lee & Sanquist, 1996), and systems in process plants (Zuboff, 1988). The range of investigative approaches demonstrates that trust is not an artifact of controlled laboratory studies and shows that it plays an important role in actual work environments.

Although substantial research suggests that trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejtersen, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation in which it emerges from the interaction between two parties (Sato, 1999). In this situation, a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them: people are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves gradually, particularly in romantic relationships: it often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or from the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop differently than it does in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct for understanding reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance; the importance of the context on trust and on mediating the effect of trust on reliance; and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication; the automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator; the automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.
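The logical difference between the two indicator types can be made concrete with a minimal sketch; the function and mode names below are invented for illustration and are not from Meyer (2001):

    # Sketch of the reliance/compliance distinction: in one case the signal's
    # absence calls for action, in the other its presence does.
    def operator_should_act(mode: str, indicator_on: bool) -> bool:
        if mode == "reliance":
            # Reliance-oriented automation signals normal operation, so the
            # absence of the signal implicitly indicates the need to act.
            return not indicator_on
        if mode == "compliance":
            # Compliance-oriented automation (e.g., a hazard warning) signals
            # an abnormal situation, explicitly indicating the need to act.
            return indicator_on
        raise ValueError(f"unknown mode: {mode}")

    print(operator_should_act("reliance", indicator_on=False))   # True: act
    print(operator_should_act("compliance", indicator_on=True))  # True: act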

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of e-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
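The divergence that such small initial differences can produce is easy to illustrate with a toy closed-loop simulation. The sketch below is not a model from the literature reviewed here: it simply assumes that trust moves toward the automation's observed capability when the automation is used, drifts toward a neutral prior when it is not, and that reliance occurs whenever trust exceeds a fixed level of self-confidence (all parameter values are invented):

    # Toy closed-loop sketch of trust and reliance dynamics (illustrative only).
    def simulate(initial_trust, capability=0.9, self_confidence=0.6,
                 gain=0.2, drift=0.02, steps=50):
        trust = initial_trust
        history = []
        for _ in range(steps):
            relying = trust > self_confidence
            if relying:
                # Interaction lets the operator observe capability, so trust
                # moves toward the automation's actual capability.
                trust += gain * (capability - trust)
            else:
                # Without interaction, trust drifts toward a neutral prior.
                trust += drift * (0.5 - trust)
            history.append((relying, round(trust, 3)))
        return history

    # Two operators whose initial trust differs by only 0.04 end up at
    # opposite equilibria: one engages the automation and trust climbs toward
    # its capability; the other never engages it and trust settles near the prior.
    print(simulate(initial_trust=0.62)[-1])  # relies; trust approaches 0.9
    print(simulate(initial_trust=0.58)[-1])  # never relies; trust stays near 0.5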

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).

A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function: the largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
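One way to write such a first-order model, offered here only as a sketch consistent with the verbal description (the cited analyses may parameterize the model differently), is

    \tau \, \frac{dT(t)}{dt} = C(t) - T(t)

where T(t) is the level of trust, C(t) is the perceived capability of the automation, and \tau is the time constant of trust. For a step change in capability to a constant level C, such as a fault, the response is the exponential function described above:

    T(t) = C + (T_0 - C)\, e^{-t/\tau}

The largest change occurs immediately after the fault, the residual effect decays over time, and \tau determines the temporal specificity: the smaller the time constant, the more quickly trust tracks changes in the capability of the automation.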

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
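Hysteresis can be made concrete with a hypothetical sketch (again, not a model from the reviewed studies): if trust is assumed to decline faster than it recovers, then ramping automation capability down and back up traces two different trust trajectories, so the trust associated with a given capability depends on the direction of change:

    # Hypothetical asymmetric first-order update: a short time constant for
    # decline, a long one for recovery. All parameters are invented.
    def update_trust(trust: float, capability: float,
                     tau_down: float = 2.0, tau_up: float = 10.0) -> float:
        tau = tau_down if capability < trust else tau_up
        return trust + (capability - trust) / tau

    trust = 0.9
    down_path, up_path = [], []
    for c in [x / 10 for x in range(9, -1, -1)]:   # capability ramps down
        trust = update_trust(trust, c)
        down_path.append((c, round(trust, 2)))
    for c in [x / 10 for x in range(10)]:          # capability ramps back up
        trust = update_trust(trust, c)
        up_path.append((c, round(trust, 2)))

    # At the same capability level, trust differs with the direction of change,
    # the delayed, direction-dependent response that defines hysteresis.
    print(down_path)
    print(up_path)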

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that final authority for decisions and actions in time-critical situations should be allocated to the automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors caused trust and reliance to decline more for those in a familiar city, whose self-confidence was high, than for those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and thus greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).
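A worked example, with numbers invented purely for illustration, shows why reliance on imperfect automation can nonetheless be the better policy when the alternative is less accurate manual control that also carries a monitoring cost:

    # Illustrative expected-cost comparison (all numbers are invented).
    p_auto_correct = 0.90    # assumed automation accuracy
    p_manual_correct = 0.70  # assumed unaided operator accuracy
    cost_error = 100.0       # cost of a missed failure
    cost_monitoring = 5.0    # attentional cost of monitoring instead of relying

    expected_cost_rely = (1 - p_auto_correct) * cost_error
    expected_cost_manual = (1 - p_manual_correct) * cost_error + cost_monitoring

    print(expected_cost_rely)    # 10.0: reliance, despite 1 in 10 errors
    print(expected_cost_manual)  # 35.0: manual control costs more on average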

One approach to reduce overtrust in the automation and increase detection of failures is adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation and environmental situations that cause automation to perform poorly are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the performance dimension was also the most influenced by the presence or absence of a fault, as compared with the ratings for the process or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al.; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of the automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an on-line survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent: a study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).

These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, its design basis, and its range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links between the abstraction hierarchy and the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research on human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure in the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen I amp Fishbein M (1980) Understanding attitudes and pre-dicting social behavior Upper Saddle River NJ Prentice Hall

Baba M L (1999) Dangerous liaisons Trust distrust and infor-mation technology in American work organizations HumanOrganization 58 331ndash346

Baba M L Falkenburg D R amp Hill D H (1996) Technologymanagement and American culture Implications for businessprocess redesign Research Technology Management 39(6)44ndash54

Bandura A (1982) Self-efficacy mechanism in human agencyAmerican Psychologist 37 122ndash147

Barber B (1983) The logic and limits of trust New BrunswickNJ Rutgers University Press

Barley S R (1988) The social construction of a machine Ritualsuperstition magical thinking and other pragmatic responsesto running a CT scanner In M Lock amp D R Gordon (Eds)Biomedicine examined (pp 497ndash539) Dordrecht NetherlandsKluwer Academic

Bechara A Damasio A R Damasio H amp Anderson S W(1994) Insensitivity to future consequences following damageto human prefrontal cortex Cognition 50 7ndash15

Bechara A Damasio H Tranel D amp Anderson S W (1998)Dissociation of working memory from decision making withinthe human prefrontal cortex Journal of Neuroscience 18428ndash437

TRUST IN AUTOMATION 77

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oatley, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Watanabe, Y., & Yamagishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies, and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology. Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology, and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003



trust is a valid construct in describing human-automation interaction, several fundamental differences between human-human and human-automation trust deserve consideration in extrapolating from the research on human-human trust. A fundamental challenge in extrapolating from this research to trust in automation is that automation lacks intentionality. This becomes particularly challenging for the dimension of purpose, in which trust depends on features such as loyalty, benevolence, and value congruence. These features reflect the intentionality of the trustee and are the most fundamental and central elements of trust between people. Although automation does not exhibit intentionality or truly autonomous behavior, it is designed with a purpose and thus embodies the intentionality of the designers (Rasmussen, Pejterson, & Goodstein, 1994). In addition, people may attribute intentionality and impute motivation to automation as it becomes increasingly sophisticated and takes on human characteristics, such as speech communication (Barley, 1988; Nass & Lee, 2001; Nass & Moon, 2000).

Another important difference between interpersonal trust and trust in automation is that trust between people is often part of a social exchange relationship. Often trust is defined in a repeated-decision situation, in which it emerges from the interaction between two parties (Sato, 1999). In this situation a person may act in a trustworthy manner to elicit a favorable response from another person (Mayer et al., 1995). There is symmetry to interpersonal trust, in which the trustor and trustee are each aware of the other's behavior, intents, and trust (Deutsch, 1960). How one is perceived by the other influences behavior. There is no such symmetry in the trust between people and machines, and this leads people to trust and respond to automation differently.

Lewandowsky et al. (2000) found that delegation to human collaborators was slightly different from delegation to automation. In their study, participants operated a semiautomatic pasteurization plant in which control of the pump could be delegated. In one condition, operators could delegate to an automatic controller, and in another condition they could delegate control to what they thought was another operator (in reality, the pump was controlled by the same automatic controller). Interestingly, they found that delegation to the human, but not to automation, depends on people's assessment of how others perceive them: People are more likely to delegate if they perceive their own trustworthiness to be low (Lewandowsky et al.). They also found the degree of trust to be more strongly related to the decision to delegate to the automation as compared with the decision to delegate to the human. One explanation for these results is that operators may perceive the ultimate responsibility in a human-automation partnership to lie with the operator, whereas operators in a human-human partnership may perceive the ultimate responsibility as being shared (Lewandowsky et al.). Similarly, people are more likely to disuse automated aids than they are human aids, even though self-reports suggest a preference for automated aids (Dzindolet, Pierce, Beck, & Dawe, 2002).

These results show that fundamental differences in the symmetry of the exchange relationship, such as the ultimate responsibility for system control, may lead to a different profile of factors influencing human-human delegation and human-automation delegation.

Another important difference between interpersonal trust and trust in automation is the attribution process. Interpersonal trust evolves, particularly in romantic relationships. It often begins with a basis in performance or reliability, then progresses to a process or dependability basis, and finally evolves to a purpose or faith basis (Rempel et al., 1985). Muir (1987, 1994) has argued that trust in automation develops in a similar series of stages. However, trust in automation can also follow an opposite pattern, in which faith is important early in the interaction, followed by dependability and then by predictability (Muir & Moray, 1996). When the purpose of the automation was conveyed by a written description, operators initially had a high level of purpose-level trust (Muir & Moray). Similarly, people seemed to trust at the level of faith when they perceived a specialist television set (e.g., one that delivers only news content) as providing better content than a generalist television set (e.g., one that can provide a variety of content; Nass & Moon, 2000).

These results contradict Muir's (1987) predictions, but this can be resolved if the three dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or from the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop in a way different from that in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct to understand reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of E-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
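This closed-loop divergence can be made concrete with a toy simulation. The sketch below is illustrative only: the update rule, the reliance threshold, and every constant are assumptions chosen for exposition, not a model from this article or from Lee and Moray (1994). It captures the qualitative point that trust grows mainly through reliance, so two operators whose initial trust differs only slightly can diverge to opposite extremes.

    import random

    def simulate(initial_trust, reliability=0.9, steps=200, seed=1):
        """Illustrative closed-loop trust-reliance dynamics (all constants assumed)."""
        rng = random.Random(seed)
        trust = initial_trust
        for _ in range(steps):
            if trust > 0.5:
                # Relying: the operator observes the automation succeed or fail
                # and updates trust toward the observed success rate.
                observed = 1.0 if rng.random() < reliability else 0.0
                trust += 0.1 * (observed - trust)
            else:
                # Not relying: little opportunity to observe, so trust drifts
                # slowly toward a neutral prior and cannot track true capability.
                trust += 0.01 * (0.4 - trust)
        return trust

    print(round(simulate(initial_trust=0.52), 2))  # relies; trust climbs toward ~0.9
    print(round(simulate(initial_trust=0.48), 2))  # disuse persists; trust stays near 0.4

Because observation is contingent on reliance, the first operator converges toward the automation's true capability while the second never re-engages, mirroring the volatility and individual differences discussed above.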

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function: The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly the trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray).
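The shape of this response can be written out explicitly. The following is a minimal sketch in our own notation ($T$ for trust, $\tau$ for the time constant), not the specific equation fitted by Lee and Moray (1994): if a fault at $t = 0$ shifts the level of trust the automation can sustain from the prefault level $T_0$ to a new asymptote $T_f$, a first-order model gives

$$\tau \frac{dT(t)}{dt} = T_f - T(t), \qquad T(t) = T_f + (T_0 - T_f)\,e^{-t/\tau}.$$

The rate of change is greatest immediately after the fault and decays exponentially, and the time constant $\tau$ expresses the temporal specificity of trust: a small $\tau$ tracks changes in automation capability quickly, whereas a large $\tau$ behaves like the time delay described earlier.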

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that automation should have the final authority for decisions and actions that must be allocated to automation in time-critical situations (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: Low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors cause trust and reliance to decline more for those in a familiar city, whose self-confidence was high, as compared with those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).
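This trade-off can be sketched in expected-value terms; the notation is ours, not a formula from Moray (2003). If checking the automation during an interval costs $C_m$, the automation fails during that interval with probability $p_f$, and an unnoticed failure costs $C_f$, then checking is worthwhile only when

$$p_f\,C_f > C_m,$$

so for highly reliable automation (small $p_f$) a sparse, eutactic sampling schedule can be optimal even though it will occasionally miss a failure.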

One approach to reduce overtrust in the automation and increase detection of failures is through adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and product dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the process dimension was most influenced by the presence or absence of a fault, as compared with the performance or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: Negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al.; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an online survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).


These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding that specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.

• Show the past performance of the automation.

• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.

• Simplify the algorithms and operation of the automation to make it more understandable.

• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.

• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.

• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.

• Show how past performance depends on situations.

• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.

• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent (one way such an evaluation might be quantified is sketched after this list).

• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
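As a rough illustration of the evaluation consideration above, calibration and resolution can be quantified once trust ratings and automation capability are expressed on a common 0-to-1 scale. The sketch below is one plausible operationalization, not a measure prescribed by the literature; the metric choices (mean absolute mismatch for calibration error, variance of mean trust across binned capability levels for resolution) and the sample data are our assumptions.

```python
# A hedged sketch of quantifying two properties of trust: calibration
# (does trust match capability?) and resolution (does trust
# differentiate levels of capability?). Metric choices are illustrative.
from collections import defaultdict
from statistics import mean, pvariance

def calibration_error(trust, capability):
    """Mean absolute mismatch between trust and capability (0 = perfect)."""
    return mean(abs(t - c) for t, c in zip(trust, capability))

def resolution(trust, capability, bins=4):
    """Variance of mean trust across capability levels; 0 = flat trust."""
    groups = defaultdict(list)
    for t, c in zip(trust, capability):
        groups[min(int(c * bins), bins - 1)].append(t)
    return pvariance([mean(g) for g in groups.values()])

# Hypothetical ratings from one operator across six contexts.
capability = [0.9, 0.8, 0.9, 0.4, 0.3, 0.5]
trust = [0.8, 0.8, 0.7, 0.5, 0.4, 0.4]
print(calibration_error(trust, capability))  # ~0.10
print(resolution(trust, capability))         # > 0: trust tracks context
```

High resolution combined with low calibration error would indicate trust that tracks context-dependent capability, which is what the considerations above argue matters most when automation performance varies across situations.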

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.

• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.

• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.

• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelley, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oatley, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Watanabe, Y., & Yamagishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.

MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejtersen, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.

Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Van de Ven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002. Date accepted: September 4, 2003.


dimensions of trust are considered as different levels of attributional abstraction rather than as stages of development. Attributions of trust can be derived from a direct observation of system behavior (performance), an understanding of the underlying mechanisms (process), or from the intended use of the system (purpose). Early in the relationship with automation there may be little history of performance, but there may be a clear statement regarding the purpose of the automation. Thus, early in the relationship, trust can depend on purpose and not performance. The specific evolution depends on the type of information provided by the human-computer interface and the documentation and training. Trust may first be based on faith or purpose and then, as experience increases, operators may develop a feeling for the automation's dependability and predictability (Hoc, 2000). This is not a fundamental difference compared with trust between people, but it reflects how people are often thrust into relationships with automation in a way that forces trust to develop in a way different from that in many relationships between people.

A similar phenomenon occurs with temporary groups, in which trust must develop rapidly. In these situations of "swift trust," the role of dimensions underlying trust is not the same as their role in more routine interpersonal relationships (Meyerson, Weick, & Kramer, 1996). Likewise, with virtual teams, trust does not progress as it does with traditional teams, in which different types of trust emerge in stages; instead, all types of trust emerge at the beginning of the relationship (Coutu, 1998). Interestingly, in a situation in which operators delegated control to automation and to what the participants thought were human collaborators, trust evolved in a similar manner, and in both cases trust dropped quickly when the performance of the trustee declined (Lewandowsky et al., 2000). Like trust between people, trust in automation develops according to the information available rather than following a fixed series of stages.

Substantial evidence suggests that trust in automation is a meaningful and useful construct to understand reliance on automation; however, the differences in symmetry, lack of intentionality, and differences in the evolution of trust suggest that care must be exercised in extrapolating findings from human-human trust to human-automation trust. For this reason, research that considers the unique elements of human-automation trust and reliance is needed, and we turn to this next.

A DYNAMIC MODEL OF TRUST AND RELIANCE ON AUTOMATION

Figure 4 expands on the framework used to integrate studies regarding trust between humans and addresses the factors affecting trust and the role of trust in mediating reliance on automation. It complements the framework of Dzindolet et al. (2002), which focuses on the role of cognitive, social, and motivational processes that combine to influence reliance. The framework of Dzindolet et al. (2002) is particularly useful in describing how the motivational processes associated with expected effort complement cognitive processes associated with automation biases to explain reliance issues that are distributed across the upper part of Figure 4.

As with the distinctions made by Bisantz and Seong (2001) and Riley (1994), this framework shows that trust and its effect on behavior depend on a dynamic interaction among the operator, context, automation, and interface. Trust combines with other attitudes (e.g., subjective workload, effort to engage, perceived risk, and self-confidence) to form the intention to rely on the automation. For example, exploratory behavior, in which people knowingly compromise system performance to learn how it behaves, could influence intervention or delegation (Lee & Moray, 1994). Once the intention is formed, factors such as time constraints and configuration errors may affect whether or not the person actually relies on the automation. Just as the decision to rely on the automation depends on the context, so too does the performance of that automation. Environmental variability, such as weather conditions, and a history of inadequate maintenance can degrade the performance of the automation and make reliance inappropriate.

Three critical elements of this framework are the closed-loop dynamics of trust and reliance, the importance of the context on trust and on mediating the effect of trust on reliance, and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust, and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships (McKnight et al., 1998) suggest that a similar phenomenon occurs between people and automation. For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of e-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely it is that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
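This divergence can be made concrete with a small simulation. The following is a minimal sketch (ours, not the authors'; the threshold rule and the gain and decay parameters are illustrative assumptions, not empirical estimates) of how the closed-loop dynamics in Figure 4 can amplify a small initial difference in trust:

# Minimal sketch of the closed-loop trust-reliance dynamics described
# above. Assumptions: trust is a scalar in [0, 1]; the operator relies
# on the automation only when trust exceeds a threshold; trust moves
# toward the automation's observed capability only while relying,
# because manual control affords little opportunity for observation.

def simulate_trust(initial_trust, capability=0.8, threshold=0.5,
                   gain=0.2, decay=0.02, steps=50):
    trust = initial_trust
    history = [trust]
    for _ in range(steps):
        if trust > threshold:
            # Reliance permits observation, so trust tracks capability.
            trust += gain * (capability - trust)
        else:
            # Manual control yields little new evidence; trust erodes.
            trust -= decay * trust
        history.append(trust)
    return history

# Operators whose initial trust differs by only 0.02 end up far apart:
print(round(simulate_trust(0.51)[-1], 2))  # ~0.8: engages, trust grows
print(round(simulate_trust(0.49)[-1], 2))  # ~0.18: never engages, trust decays

Under these assumptions, the first operator converges toward the automation's capability while the second drifts toward stable manual control, illustrating why trust can appear volatile near the decision threshold and stable far from it.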

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).

A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function. The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray, 1994).
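In the notation of a standard first-order lag (a sketch; the symbols T, T_0, T_infinity, and tau are ours, not taken from the original analysis), the dynamics described above can be written as

\tau \frac{dT(t)}{dt} = T_{\infty} - T(t), \qquad T(t) = T_{\infty} + (T_{0} - T_{\infty})\, e^{-t/\tau}

where T(t) is trust at time t, T_0 is trust at the moment the fault occurs, T_infinity is the trust level warranted by the automation's post-fault capability, and tau is the time constant. A step change in capability then produces exactly the pattern described: the steepest change in trust immediately, flattening exponentially, with a large tau corresponding to poor temporal specificity in which trust lags the automation's actual capability.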

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
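As an illustration of one of these miscalibration patterns, hysteresis in reliance can be sketched with a two-threshold rule (our example; the engage and abandon thresholds are illustrative assumptions):

# Sketch of hysteresis: the trust needed to engage the automation is
# higher than the trust at which it is abandoned, so reliance at a
# given trust level depends on the direction from which trust arrived.

def reliance_with_hysteresis(trust_series, engage_at=0.7, abandon_at=0.4):
    relying = False
    states = []
    for trust in trust_series:
        if not relying and trust >= engage_at:
            relying = True    # rising trust must clear the higher bound
        elif relying and trust <= abandon_at:
            relying = False   # falling trust must cross the lower bound
        states.append(relying)
    return states

rising = [0.3, 0.5, 0.6, 0.75]
falling = [0.75, 0.6, 0.5, 0.3]
print(reliance_with_hysteresis(rising))   # [False, False, False, True]
print(reliance_with_hysteresis(falling))  # [True, True, True, False]

At a trust level of 0.5 or 0.6 the operator relies in one direction of change but not the other, which is precisely the delayed, direction-dependent response the text describes.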

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that the final authority for decisions and actions in time-critical situations should be allocated to automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors caused trust and reliance to decline more for those in a familiar city, whose self-confidence was high, than for those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation's performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on appropriate reliance on automation.
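A decision-rule reading of these findings can be sketched in a few lines. This is our illustration, not a model from the paper: it simply encodes the Lee and Moray (1994) observation that operators tend toward automation when trust exceeds self-confidence, with a hypothetical bias term standing in for individual differences such as the predisposition to trust.

# Sketch of a trust/self-confidence decision rule (illustrative only).
def chooses_automation(trust, self_confidence, bias=0.0):
    # bias > 0 models a predisposition toward automation;
    # bias < 0 models a predisposition toward manual control.
    return (trust - self_confidence + bias) > 0

print(chooses_automation(trust=0.8, self_confidence=0.6))  # True
print(chooses_automation(trust=0.5, self_confidence=0.7))  # False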

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and still greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003).

Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).
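A small worked example shows why. The numbers are illustrative assumptions, not data from the studies cited: suppose the automation is correct 90% of the time and the unaided operator 75% of the time. A probability-matching strategy, relying on the automation only in proportion to its reliability, yields lower expected accuracy than consistent reliance:

# Illustrative arithmetic: consistent reliance on imperfect automation
# can beat probability matching (assumed, not empirical, values).
p_auto = 0.90    # automation correct
p_manual = 0.75  # operator correct under manual control

always_rely = p_auto                                   # 0.900
matching = p_auto * p_auto + (1 - p_auto) * p_manual   # 0.885

print(always_rely > matching)  # True: occasional errors alone do not
                               # justify abandoning the automation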

One approach to reducing overtrust in the automation and increasing the detection of failures is adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other, similar, but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the performance dimension was also the most influenced by the presence or absence of a fault, as compared with the process or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al., 2000; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance; it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, the appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an online survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced the trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).

These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk than when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & MacGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.

• Show the past performance of the automation.

• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.

• Simplify the algorithms and operation of the automation to make it more understandable.

• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.

• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.

• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.

• Show how past performance depends on situations.

• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.

• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.

• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.

• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.

• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.

• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure in the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization; as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social-stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oatley, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Watanabe, Y., & Yamagishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.

MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley V (1996) Operator reliance on automation Theory anddata In R Parasuraman amp M Mouloua (Eds) Automationtheory and applications (pp 19ndash35) Mahwah NJ Erlbaum

Riley, V. A. (1994). Human use of automation. Unpublished PhD dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology, and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his PhD in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an MS in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003


and the role of information display on developing appropriate trust. The research addressing trust between people provides a basis for understanding how these three factors might influence trust in automation.

The Dynamics of Trust and Reliance

Figure 4 shows that trust and its effect on reliance are part of a closed-loop process in which the dynamic interaction with the automation influences trust, and trust influences the interaction with the automation. If the system is not trusted, it is unlikely to be used; if it is not used, the operator may have limited information regarding its capabilities, and it is unlikely that trust will grow. Because trust is largely based on observation of the behavior of the automation, in most cases automation must be relied upon for trust to grow (Muir & Moray, 1996). Relying on automation provides operators with an opportunity to observe how the automation works and thus to develop greater trust. Also, highly trusted automation may be monitored less frequently (Muir & Moray, 1996), and so trust may continue to grow even if occasional faults occur.

Figure 4. A conceptual model of the dynamic process that governs trust and its effect on reliance.

The type of automation may have a large effect on the ability to observe the automation and on the dynamics of trust. Four types of automation have important implications for the dynamics of trust: information acquisition, information analysis, decision selection, and action implementation (Parasuraman, Sheridan, & Wickens, 2000). These types of automation vary greatly regarding the observability of the automation when it is not being relied upon. For example, it is possible for operators to observe the behavior of information acquisition automation even when they are not relying on it, as in automation that cues potential targets (Yeh & Wickens, 2001). Information acquisition automation even makes it possible for users to dynamically balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.
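The logic of this distinction is compact enough to state directly. The sketch below is a hypothetical encoding of the implicit versus explicit indicator logic described above, for illustration only; it is not an implementation from Meyer (2001):

```python
def operator_should_act(signal_present: bool, automation_type: str) -> bool:
    """Hypothetical encoding of the reliance/compliance distinction.
    Reliance-oriented automation signals normal operation, so the operator
    acts when that signal is absent (an implicit indicator). Compliance-
    oriented automation signals an abnormal state, so the operator acts
    when the signal is present (an explicit indicator)."""
    if automation_type == "reliance":
        return not signal_present   # act when the 'all clear' indication stops
    if automation_type == "compliance":
        return signal_present       # act when the hazard warning appears
    raise ValueError(f"unknown automation type: {automation_type}")
```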

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of e-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.
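The closed-loop argument can be made concrete with a small simulation. The sketch below is purely illustrative: the engagement threshold, the first-order update rule, and all parameter values are assumptions chosen to show how two operators whose initial trust differs only slightly can diverge; it is not a model taken from the studies cited here.

```python
import random

def simulate_trust(initial_trust, steps=100, capability=0.9, tau=10.0, seed=1):
    """Illustrative closed-loop sketch: trust determines reliance, and reliance
    determines whether the automation is observed and trust is updated."""
    random.seed(seed)
    trust = initial_trust
    for _ in range(steps):
        if trust > 0.5:  # assumed threshold for engaging the automation
            # Observed outcome: the automation succeeds with p = capability.
            observed = 1.0 if random.random() < capability else 0.0
            trust += (observed - trust) / tau  # drift toward the evidence
        # When not relying, the automation goes unobserved and trust is frozen.
    return trust

# Slightly different starting points lead to qualitatively different outcomes:
print(simulate_trust(initial_trust=0.55))  # engages; trust climbs toward 0.9
print(simulate_trust(initial_trust=0.45))  # never engages; trust stays at 0.45
```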

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).

A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function. The largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray, 1994).
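This first-order form can be written compactly. The notation below is an assumed reconstruction for illustration (T for trust, P for perceived performance, and tau for the time constant); it is not the exact equation reported by Lee and Moray (1994):

```latex
% Minimal sketch of first-order trust dynamics; T, P, and tau are assumed
% notation, not the exact variables of Lee and Moray (1994).
\[
  \tau \, \frac{dT(t)}{dt} = -\,T(t) + P(t)
\]
% For a step change to a constant performance level P, the response is the
% exponential described in the text: the largest effect appears immediately
% and the residual decays with time constant tau.
\[
  T(t) = P + \left( T_0 - P \right) e^{-t/\tau}
\]
```

In this form, the time constant corresponds to the temporal specificity discussed earlier: a small time constant lets trust track changes in capability quickly, whereas a large one produces the inertia noted by Farrell and Lewandowsky (2000).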

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
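As an illustration of how hysteresis might appear in reliance, the following sketch gives the engage and disengage decisions different trust thresholds, so the response lags and depends on the direction of change. The thresholds and the thresholding rule itself are hypothetical choices for illustration, not an analysis from the sources cited above:

```python
def reliance_with_hysteresis(trust_series, engage_at=0.7, disengage_at=0.4):
    """Illustrative sketch: reliance switches on only above a high trust
    threshold and off only below a lower one (assumed values), so the same
    trust level can yield different decisions depending on history."""
    relying = False
    decisions = []
    for trust in trust_series:
        if not relying and trust > engage_at:
            relying = True
        elif relying and trust < disengage_at:
            relying = False
        decisions.append(relying)
    return decisions

# Trust passing through the same values in opposite directions:
print(reliance_with_hysteresis([0.3, 0.5, 0.6, 0.8]))  # [False, False, False, True]
print(reliance_with_hysteresis([0.8, 0.6, 0.5, 0.3]))  # [True, True, True, False]
# At trust = 0.5 or 0.6, the decision depends on the direction of change.
```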

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that the final authority for decisions and actions should be allocated to automation in time-critical situations (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: Low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors cause trust and reliance to decline more for those in a familiar city, whose self-confidence was high, as compared with those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation combined with the responsibility for multiple tasks in addition to monitoring the automation can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust and a cycle of greater trust leading to less vigilant monitoring and greater trust may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003). Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).
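A toy expected-cost calculation conveys the intuition behind such cost-balanced monitoring. This sketch is a hypothetical illustration with made-up costs; it is not Moray's (2003) formal analysis:

```python
def expected_cost(p_monitor, p_fail, c_monitor, c_missed_fail):
    """Expected cost per interval under an assumed cost structure: monitoring
    carries a fixed cost, and an unmonitored failure carries a larger one."""
    return p_monitor * c_monitor + (1 - p_monitor) * p_fail * c_missed_fail

rates = [i / 10 for i in range(11)]
# With rare failures and modest failure costs, sparse monitoring minimizes cost:
print(min(rates, key=lambda p: expected_cost(p, 0.01, 1.0, 50.0)))   # 0.0
# If missed failures are costly enough, constant monitoring wins instead:
print(min(rates, key=lambda p: expected_cost(p, 0.01, 1.0, 500.0)))  # 1.0
```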

One approach to reduce overtrust in the automation and increase detection of failures is through adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation and environmental situations that cause automation to perform poorly are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the performance dimension was most influenced by the presence or absence of a fault, as compared with the process or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: Negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al., 2000; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of the automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an online survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent: A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).

These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding that specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance, it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking, and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents, and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelley, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oatley, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Watanabe, Y., & Yamagishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies, and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.

MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished PhD dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism, and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished PhD dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejtersen, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.

Riley, V. A. (1994). Human use of automation. Unpublished PhD dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Van de Ven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology, and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his PhD in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an MS in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003

balance their attention between the information from the automation and the raw data (Wickens, Gempler, & Morphew, 2000). In contrast, it is not possible for operators to observe action implementation automation unless they are relying on it, as in automation that controls a pump (Lee & Moray, 1994). This difficulty in observing the automation after operators adopt manual control may be one reason trust fails to recover fully when the reliability of the automation improves (Lee & Moray, 1992). The relative difficulty in observing action implementation automation may also contribute to the highly nonlinear patterns of reliance observed with action implementation automation (Farrell & Lewandowsky, 2000).

Another way to distinguish types of automation is whether the automation requires reliance or compliance (Meyer, 2001). With reliance-oriented automation, the automation indicates when the system is operating normally, and the operators act when the automation fails to provide this indication. The automation provides an implicit indicator of the need to act. With compliance-oriented automation (e.g., a hazard warning), the automation indicates an abnormal situation, and the operators act in response to this indicator. The automation provides an explicit indicator of the need to act. Meyer (2001) demonstrated the importance of this distinction by showing that people were more cautious with compliance-oriented automation and that integrating compliance-oriented information with a monitored signal further increased their caution.

The dynamic interplay among changes in trust, the effect of these changes on reliance, and the resulting feedback can generate substantial nonlinear effects that can lead people to adopt either fully automatic or fully manual control (Lee & Moray, 1994). The nonlinear cascade of effects noted in the dynamics of interpersonal trust (Ostrom, 1998) and the high degree of volatility in the level of trust in some relationships suggest that a similar phenomenon occurs between people (McKnight et al., 1998). For example, the initial level of trust is important because it can frame subsequent interactions, such as in resolving the ambiguity of e-mail messages (Coutu, 1998). A similar effect is seen in which the initial expectations of perfect automation performance can lead to disuse when people find the automation to be imperfect (Dzindolet et al., 2002). This nonlinear behavior may also account for the large individual differences noted in human-automation trust (Lee & Moray, 1994; Moray, Inagaki, & Itoh, 2000).

A small difference in the predisposition to trust may have a substantial effect when it influences an initial decision to engage the automation. Such minor variations in the initial level of trust might lead one person to engage the automation and another to adopt manual control. Reliance on the automation may then lead to a substantial increase in trust for the first operator, whereas the trust of the second operator might decline. Characterizing the dynamics of the feedback loop shown in Figure 4 is essential to understanding why trust sometimes appears volatile and at other times stable. This closed-loop framework also highlights the importance of temporal specificity in characterizing the appropriateness of trust. The higher the temporal specificity, the more likely it is that trust will correspond to the current capability of the automation. Poor temporal specificity will act as a time delay, leading to an unstable pattern of reliance and increasing the volatility of trust. The volatility of trust may depend on the characteristics of the operator, the automation, and the context in which they interact through a closed-loop feedback process, rather than on any one of these three factors alone.

Many studies have considered the effect of faults on trust in the automation. This effect is best understood by considering the development and erosion of trust as a dynamic process. Trust declines and then recovers with acute failures, and it declines with chronic failures until operators understand the fault and learn to accommodate it (Itoh, Abe, & Tanaka, 1999; Lee & Moray, 1992). Chronic failures represent a permanent decline or deficiency in the capability of the automation (e.g., a software bug), whereas acute failures represent a transient failure after which the capability of the automation returns to its prefault capability (e.g., a maintenance-related failure that is repaired). The effect of faults on trust is not instantaneous; faults cause trust to decline over time. Likewise, the recovery after faults is not instantaneous but occurs over time (Lee & Moray, 1992).

A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function: the largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray, 1994).
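
In symbols, a minimal version of such a first-order model can be written as follows. This is an illustrative sketch consistent with the description above; the mapping f and its parameters are assumptions for exposition, not the fitted equation reported by Lee and Moray (1994):

  \tau \, \frac{dT(t)}{dt} = -T(t) + f\big(P(t)\big)

where T(t) is the level of trust, P(t) is the perceived performance of the automation, f(\cdot) maps performance to the level of trust it would eventually support, and \tau is the time constant of trust. For a step change in performance, the solution is the exponential

  T(t) = T_{\infty} + (T_{0} - T_{\infty})\, e^{-t/\tau}

so the largest change occurs immediately after a fault, the residual effect decays over time, and larger values of \tau correspond to poorer temporal specificity.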

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time. A dynamical approach also capitalizes on the power of differential equations to represent continuous interaction among system components over time. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
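
A minimal simulation can make these dynamics concrete. The sketch below is illustrative only: the thresholds, time constant, and observability parameter are assumptions chosen for exposition, not values estimated in the studies reviewed here. It couples the first-order trust update described above to a threshold reliance decision and shows how limited observability under manual control produces the delayed, direction-dependent response that characterizes hysteresis:

import numpy as np

def simulate_trust(perf_series, tau=10.0, theta_engage=0.6,
                   theta_disengage=0.4, observability=0.1, T0=0.5):
    """Couple a first-order trust model to a reliance decision.

    perf_series: automation performance on [0, 1] at each time step.
    tau: time constant of trust (larger = slower change).
    theta_engage/theta_disengage: hysteresis thresholds for reliance.
    observability: fraction of performance information available
        during manual control, when the automation is hard to observe.
    """
    T, relying = T0, False
    trust, reliance = [], []
    for p in perf_series:
        # Reliance decision with hysteresis: engage above one
        # threshold, disengage below a lower one.
        if relying and T < theta_disengage:
            relying = False
        elif not relying and T > theta_engage:
            relying = True
        # Trust moves toward observed performance (first-order
        # dynamics); under manual control the automation is hard to
        # observe, so the effective update rate is reduced.
        rate = 1.0 if relying else observability
        T += rate * (p - T) / tau
        trust.append(T)
        reliance.append(relying)
    return np.array(trust), np.array(reliance)

# Performance drops after a fault and then recovers; trust lags the
# recovery because disuse limits observation of the automation.
perf = np.concatenate([np.full(100, 0.9), np.full(100, 0.3),
                       np.full(200, 0.9)])
trust, reliance = simulate_trust(perf)

In this sketch, trust declines quickly once the fault is observed but recovers slowly after reliability improves, because the operator has disengaged and can no longer observe the automation – the failure-to-recover pattern noted above.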

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that the final authority for decisions and actions in time-critical situations should be allocated to the automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a particularly critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors cause trust and reliance to decline more for those in a familiar city, whose self-confidence was high, as compared with those in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust. These results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on the appropriate reliance on automation.

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation, combined with the responsibility for multiple tasks in addition to monitoring the automation, can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996). The findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle of greater trust leading to less vigilant monitoring and greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003).

Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).

One approach to reduce overtrust in the automation and increase detection of failures is through adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation and environmental situations that cause automation to perform poorly are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault. A fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in trust ratings associated with the performance dimension than in the ratings associated with process or purpose. The rating that corresponded to the performance dimension was also most influenced by the presence or absence of a fault, as compared with the process or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude. A small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al., 2000; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.

The environmental context not only influences trust and reliance, it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems, it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an online survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that, for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust. These features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form – in particular, the emphasis on concrete, realistic representations – can increase the level of trust.
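
The benefit of coding source reliability can be related to a standard ideal-observer result from signal detection theory. As a hedged illustration (this is the textbook relation for independent, equal-variance Gaussian elements, not the specific analysis reported by Montgomery and Sorkin, 1996), the optimal linear decision variable weights each display element in proportion to its sensitivity:

  D = \sum_i w_i x_i, \qquad w_i \propto d'_i

where x_i is the evidence from element i and d'_i indexes that element's reliability. Luminance coding of reliability can be seen as helping observers approximate these weights.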

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent. A study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).

These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding that specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following:

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research on human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.

MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.

Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perservative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering, with a focus on trust in automation, from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003


A time-series analysis of this response suggests that a first-order differential equation can account for the change of trust over time. This means that a fault that affects operators' trust in an automated system will influence trust over time according to an exponential function: the largest effect will be seen immediately, with a residual effect distributed over time. The time-series analysis identifies the time constant of trust, which corresponds to the temporal specificity of trust and determines how quickly trust changes to reflect changes in the capabilities of the automation. This time-series equation accounted for between 60% and 86% of the variance in the overall patterns of operators' reliance on automation (Lee & Moray, 1994). This analysis also shows that trust and reliance have inertia, which may be a critical element in understanding reliance on automation (Farrell & Lewandowsky, 2000; Lee & Moray, 1994).
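
To make these dynamics concrete, the sketch below simulates trust as a first-order lag of automation performance. It is a minimal illustration, not the model fitted by Lee and Moray (1994): the Euler discretization, the parameter names (tau, trust0), and all values are assumptions chosen for exposition.

```python
import numpy as np

def simulate_trust(performance, tau=5.0, trust0=0.5, dt=1.0):
    """Trust as a first-order lag of automation performance.

    Implements dT/dt = (performance - T) / tau with an explicit Euler
    step. A sudden fault therefore produces its largest effect on trust
    immediately, followed by an exponentially decaying residual effect,
    and tau plays the role of the time constant of trust.
    """
    trust = np.empty(len(performance))
    t = trust0
    for i, p in enumerate(performance):
        t += dt * (p - t) / tau
        trust[i] = t
    return trust

# A fault degrades performance on trials 20-29, then clears.
perf = np.ones(60)
perf[20:30] = 0.2
print(simulate_trust(perf)[[19, 20, 29, 45]])  # drop is steepest at onset; recovery is gradual
```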

This dynamic approach to decision making, as defined by the decision to rely on the automation, differs from most approaches. Most theories and analytic techniques applied to reliance and decision making adopt a static approach (i.e., one that does not consider the passage of time). Static approaches that have been used to identify miscalibration of self-confidence in decision making address only cumulative experience, rather than the evolving experience and continuous recalibration that are critical for appropriate reliance on automation (Lichtenstein, Fischhoff, & Phillips, 1982; Wright, Rowe, Bolger, & Gammack, 1994). For example, traditional decision-making theories describe the outcome of the decision process but not the vacillation that accompanies most decision making (Busemeyer & Townsend, 1993).

A dynamical approach may better reflect the factors influencing reliance and their effect over time, and it capitalizes on the power of differential equations to represent continuous interaction among system components. Such an approach supports analyses that can define attractor fields, stability metrics, and phase transition points (van Gelder & Port, 1995). These concepts have successfully described motor control (Kelso, 1995; Shaw & Kinsella-Shaw, 1988) but have been applied only recently to cognitive and social behavior (Beer, 2001; Thelen, Schoner, Scheier, & Smith, 2001). This approach may be useful in understanding trust. Specifically, a dynamical approach can describe new types of trust miscalibration – for example, hysteresis and enhanced contrast. Hysteresis constitutes a delayed response that depends on the direction of the change. Enhanced contrast is an accelerated response that also depends on the direction of the change. Hence, a dynamical approach may lead to new analysis methods to describe the development and erosion of trust.
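
As a concrete illustration of hysteresis, the toy model below traces trust while reliability first degrades and then recovers. The lag model and every parameter are invented for illustration; the point is only that the same reliability value maps to different trust levels depending on the direction of change.

```python
import numpy as np

def lagged_trust(reliability, tau=8.0, trust0=0.95):
    """Illustrative first-order lag of trust behind reliability."""
    t, trace = trust0, []
    for r in reliability:
        t += (r - t) / tau  # trust moves toward current reliability
        trace.append(t)
    return np.array(trace)

down = np.linspace(1.0, 0.4, 31)  # reliability degrades...
up = np.linspace(0.4, 1.0, 31)    # ...then recovers
trust = lagged_trust(np.concatenate([down, up]))

# Reliability equals 0.7 at index 15 (falling) and index 46 (rising),
# but trust differs at those two moments: a delayed, direction-dependent
# response of the kind the dynamical approach can formalize.
print(trust[15], trust[46])
```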

Context, Reliability, Reliance, and Trust

Figure 4 shows that reliance depends on many factors in addition to trust, particularly those affecting the capability of the automation, and that trust depends on the context. Identifying how these factors interact with trust to affect reliance can help to reconcile conflicting literature and support more robust design guidance. For example, the level of trust combines with other attitudes and expectations, such as subjective workload and self-confidence, to determine the intention to rely on the automation. A variety of system and human performance constraints also affect how the intention to rely on automation translates into actual reliance. The demands of configuring and engaging automation may make reliance inappropriate even when trust is high and the automation is very capable (Kirlik, 1993). For example, the operator may intend to use the automation but not have sufficient time to engage it. Because of this, some propose that the final authority for decisions and actions in time-critical situations should be allocated to the automation (Moray et al., 2000). Understanding the role of trust in the decision to rely on automation requires a consideration of the operating context that goes beyond trust alone.

A particularly important variable that interacts with trust to influence reliance is self-confidence. Self-confidence is a critical factor in decision making in general (Bandura, 1982; Gist & Mitchell, 1992) and in mediating the effect of trust on reliance in particular (Lee & Moray, 1994). When operators' self-confidence is high and trust in the system is low, they are more inclined to rely on manual control. The opposite is also true: low self-confidence is related to a greater inclination to rely on the automatic controller (Lee & Moray, 1994). A study of a vehicle navigation aid found that system errors caused trust and reliance to decline more for drivers in a familiar city, whose self-confidence was high, than for drivers in an unfamiliar city, whose self-confidence was low (Kantowitz et al., 1997a). In a comparison of students and pilots, students tended to have greater self-confidence and were less inclined to allocate tasks to automation, which they tended to distrust; these results were attributed to the pilots' greater familiarity with the automation (Riley, 1996). A similar effect was noted in a situation where people had greater confidence in their own ability than in the automation, even though the automation's performance was based on their own behavior and had exactly the same level of performance (Liu, Fuld, & Wickens, 1993). Biases in self-confidence can have a substantial effect on appropriate reliance on automation.
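
One common reading of these findings is a simple comparison between trust in the automation and self-confidence in manual control. The rule below is a deliberately crude sketch of that reading; the threshold form and the bias term are hypothetical conveniences, not the relationship as measured by Lee and Moray (1994).

```python
def prefers_automation(trust: float, self_confidence: float, bias: float = 0.0) -> bool:
    """Lean toward automatic control when trust exceeds self-confidence
    in manual control, toward manual control otherwise. `bias` is a
    hypothetical offset for individual inclination or inertia."""
    return (trust - self_confidence) > bias

# High self-confidence with modest trust -> manual control preferred.
print(prefers_automation(trust=0.5, self_confidence=0.8))  # False
# Low self-confidence with the same trust -> automation preferred.
print(prefers_automation(trust=0.5, self_confidence=0.2))  # True
```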

The multitasking demands of a situation can also interact with trust to influence reliance. A situation in which an operator has highly reliable automation, combined with the responsibility for multiple tasks in addition to monitoring the automation, can lead to overtrust in automation and undermine the detection of automation failures (Parasuraman, Molloy, & Singh, 1993). In contrast, when the operators' only task was to monitor the automation, they detected almost all failures (Thackray & Touchstone, 1989). Because these findings might be attributed to inexperienced participants or an unrealistically high failure rate, the experiment was repeated using highly experienced pilots (Parasuraman, Mouloua, & Molloy, 1994) and infrequent faults (Molloy & Parasuraman, 1996); the findings were the same. A similar pattern of results occurs when highly reliable systems appear to reinforce the perception that other cues are redundant and unnecessary and that the task can be completely delegated to the automation (Mosier, Skitka, Heers, & Burdick, 1998).

Excessive trust, and a cycle in which greater trust leads to less vigilant monitoring and thus still greater trust, may explain why monitoring of automation in a multitask environment is imperfect. Another explanation is that operators adopt a eutactic monitoring pattern that can be considered optimal because it balances the costs of monitoring and the probability of failures (Moray, 2003).

Similar to probability matching in signal detection theory, it may be appropriate to rely on imperfect automation even though it will lead to some incorrect outcomes (Wickens et al., 2000).

One approach to reducing overtrust in the automation and increasing the detection of failures is adaptive automation, which shifts between manual and automatic control according to the capabilities of the person and the situation. In one study, participants monitored an automated engine status display while performing a tracking and fuel management task. This multitask flight simulation included adaptive automation that returned the engine status monitoring task to the person for 10 min in one condition; in another condition, the monitoring task was automated during the whole experiment. The 10 min in manual control substantially improved subsequent detection of automation failures (Parasuraman, Mouloua, & Molloy, 1996). Passive monitoring combined with the responsibility for other tasks seems to increase reliance on the automation.

Faults with automation, and environmental situations that cause automation to perform poorly, are fundamental factors affecting trust and the disuse of automation. Not surprisingly, many studies have shown that automation faults cause trust to decline. Importantly, this effect is often specific to the automatic controller with the fault: a fault in the automation did not cause trust to decline in other, similar but functionally independent automatic controllers (Lee & Moray, 1992; Muir & Moray, 1996). A multitrait-multimethod analysis also showed that trust was specific to the particular automatic controller and independent of self-confidence (Lee & Moray, 1994). These results indicate that the functional specificity of trust is not limited to the system as a whole but can be linked to particular controllers. The exact degree of functional specificity depends on the information presented to operators and on their goals in controlling the system.

In a similar study, participants rated their trust in a pasteurization system on three subjective scales corresponding to the purpose, process, and performance dimensions of trust. Changes in the level of system performance accounted for a greater percentage of variance in the trust ratings associated with the performance dimension than in the ratings associated with process or purpose. Likewise, the rating that corresponded to the performance dimension was most influenced by the presence or absence of a fault, as compared with the process or purpose dimensions (Lee & Moray, 1992). These results show that trust reflects the effects of faults in a way that is consistent with the theoretical expectations outlined in Figure 4.

Trust is a nonlinear function of automation performance and the dynamic interaction between the operator and the automation. It tends to be conditioned by the worst behaviors of an automated system (Muir & Moray, 1996), which is also the case with people: negative interactions have a greater influence than do positive interactions (Kramer, 1999). In addition, initial experience has a lasting effect on trust. An initially low level of reliability leads to lower trust and reliance when reliability subsequently improves. Likewise, trust is more resilient if automation reliability starts high and declines than if it starts low and increases (Fox & Boehm-Davis, 1998). This effect may reflect nonlinear feedback mechanisms in which a distrusted system is not used and so becomes even less trusted and less used.
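
The feedback mechanism in the last sentence can be made explicit with a toy loop in which reliance gates the evidence the operator receives. Everything here (the reliance threshold, learning rate, and trial count) is invented for illustration.

```python
def trust_with_gated_evidence(true_reliability, initial_trust,
                              threshold=0.5, learning_rate=0.2, trials=50):
    """Toy feedback loop: trust updates toward observed reliability only
    on trials where the operator relies on the automation (trust above
    threshold). Low initial trust therefore locks in disuse, so trust
    never recovers even when the automation is actually reliable."""
    t = initial_trust
    for _ in range(trials):
        if t > threshold:  # relies -> observes performance -> updates
            t += learning_rate * (true_reliability - t)
        # else: unused automation yields no new evidence; trust is frozen
    return t

print(trust_with_gated_evidence(0.95, initial_trust=0.3))  # stuck near 0.3
print(trust_with_gated_evidence(0.95, initial_trust=0.6))  # converges toward 0.95
```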

Beyond the effect of initial levels of trust, trust decreases with decreasing reliability, but some evidence suggests that below a certain level of reliability, trust declines quite rapidly. The absolute level of this drop-off seems to be highly system and context dependent, with estimates ranging from 90% (Moray et al., 2000) and 70% (Kantowitz et al., 1997a) to 60% (Fox, 1996). Another consistent finding is that the effect of an automation fault on trust depends as much on its predictability as on its magnitude: a small fault with unpredictable results affected trust more than did a large fault of constant error (Moray et al., 2000; Muir & Moray, 1996). Trust can develop when a systematic fault occurs for which a control strategy can be developed. Consistent with this result, reliance on automation follows the level of trust, and people rely on faulty automation if they have prior knowledge of the fault (Riley, 1994). Similarly, a discrepancy between the operator's expectations and the behavior of the automation can undermine trust even when the automation performs well (Rasmussen et al., 1994). No single level of reliability can be identified that will lead to distrust and disuse; instead, trust depends on the timing, consequence, and expectations associated with failures of the automation.
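
To visualize the shape such findings imply, the toy function below maps reliability to trust with a gentle decline above a system-dependent "knee" and a rapid collapse below it. Both the functional form and the constants are invented; the cited estimates only locate the knee somewhere between roughly 0.6 and 0.9, depending on the system and context.

```python
def toy_trust_vs_reliability(reliability: float, knee: float = 0.7) -> float:
    """Illustrative nonlinear mapping from automation reliability to
    steady-state trust: gradual decline above the knee, rapid collapse
    below it. Invented for exposition, not an empirical fit."""
    if reliability >= knee:
        return 0.5 + 0.5 * (reliability - knee) / (1.0 - knee)
    return 0.5 * (reliability / knee) ** 3  # steep drop-off below the knee

for r in (0.95, 0.75, 0.65, 0.50):
    print(f"reliability {r:.2f} -> trust {toy_trust_vs_reliability(r):.2f}")
```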

The environmental context not only influences trust and reliance; it can also influence the performance of the automation. The automation may perform well in certain circumstances and not in others. For this reason, the appropriateness of trust often depends on how sensitive people are to the influence of the environment on the performance of the automation. Precisely resolving differences in the context can lead to more appropriate trust (Cohen, 2000). For example, Masalonis (2000) described how training enhanced the appropriateness of trust in the context of situation-specific reliability of decision aids for air traffic controllers.

A similar benefit was found in a situation in which aircraft predictor information was only partially reliable. Knowing that the predictor was not completely reliable helped pilots to calibrate their trust and adopt an appropriate allocation of attention between the raw data and the predictor information (Wickens et al., 2000). Similarly, Bisantz and Seong (2001) showed that failures attributable to different causes, such as intentional sabotage versus hardware or software failures, have different effects on trust. Thus, for some systems it may be useful to discriminate between perturbations driven by enemy intent and undirected variations. These results suggest that trust is more than a simple reflection of the performance of the automation; appropriate trust depends on the operators' understanding of how the context affects the capability of the automation.

Effects of Display Content and Format on Trust

Trust evolves through analytic-, analogical-, and affect-based interpretations of information regarding the capabilities of the automation. Because direct observation of the automation is often impractical or impossible, perception of automation-related information is usually mediated by a display. This suggests that the appropriateness of trust – that is, the match between the trust and the capabilities of the automation – depends on the content and format of the display. Information regarding the capabilities of the automation can be defined along the dimensions of detail and abstraction. The dimension of abstraction refers to information regarding the performance, process, and purpose of the automation. The dimension of detail influences the functional specificity of the trust: whether trust is focused on the mode of the automatic controller, the automatic controller as a whole, or a group of automatic controllers. Organizing this information in the display in a way that supports analytic-, analogical-, and affect-based assimilation of this information may be an important means of guiding appropriate expectations regarding the automation. If the information is not available in the display, or if it is formatted improperly, trust may not develop appropriately.

The content and format of the interface have a powerful effect on trust. Substantial research in this area has focused on trust in Internet-based interactions (for a review, see Corritore, Kracher, & Wiedenbeck, 2003a). In many cases, trust and credibility depend on surface features of the interface that have no obvious link to the true capabilities of the system (Briggs, Burford, & Dracup, 1998; Tseng & Fogg, 1999). In an online survey of more than 1,400 people, Fogg, Marshall, Laraki, et al. (2001) found that for Web sites, credibility depends heavily on "real-world feel," which is defined by factors such as speed of response, listing a physical address, and including photos of the organization. Similarly, a formal photograph of the author enhanced the trustworthiness of a research article, whereas an informal photograph decreased trust (Fogg, Marshall, Kameda, et al., 2001). Visual design factors of the interface, such as cool colors and a balanced layout, can also induce trust (Kim & Moon, 1998). Similarly, trusted Web sites tend to be text based, use empty space as a structural element, have strictly structured grouping, and use real photographs (Karvonen & Parkkinen, 2001).

These results show that trust tends to increase when information is displayed in a way that provides concrete details that are consistent and clearly organized. However, caution is needed when adding photographs and other features to enhance trust: these features might inadvertently degrade trust because they can undermine usability and can be considered manipulative by some (Riegelsberger & Sasse, 2002).

A similar pattern of results appears in studies of automation that supports target detection. Increasing image realism increased trust and led to greater reliance on the cueing information (Yeh & Wickens, 2001). Similarly, the tendency of pilots to blindly follow the advice of the system increased when the aid included detailed pictures as a guide (Ockerman, 1999). Just as highly realistic images can increase trust, degraded imagery can decrease trust, as was shown in a target cueing situation (MacMillan, Entin, & Serfaty, 1994). Adjusting image quality and adding information to the interface regarding the capability of the automation can promote appropriate trust. In a signal detection task, the reliability of the sources was coded with different levels of luminance, leading participants to weigh reliable sources more than unreliable ones (Montgomery & Sorkin, 1996). Likewise, increasing warning urgency increased compliance even when the participants were subject to a high rate of false alarms (Bliss et al., 1995). Although reliance and trust depend on the supporting information provided by the system, the amount of information must be tailored to the available decision time (Entin, Entin, & Serfaty, 1996). These results suggest that the particular interface form, in particular an emphasis on concrete, realistic representations, can increase the level of trust.

Similarly, trust between people builds rapidly with face-to-face communication but not with text-only communication. Interestingly, trust established in face-to-face communication transferred to subsequent text-only communication (Zheng, Bos, Olson, & Olson, 2001). Cassell and Bickmore (2000) suggested that creating a computer that is a conversational partner will induce people to trust the system by providing the same social cues that people use in face-to-face conversation. To maximize trust, designers should select speech parameters (e.g., words per minute, intonation, and frequency range) that create a personality consistent with the user and the content being presented (Nass & Lee, 2001). In addition, the speech should be consistent: a study of a speech interface showed that people were more trusting of a system that used synthetic speech consistently, as compared with one that used a combination of synthetic and human speech (Gong, Nass, Simard, & Takhteyev, 2001).

These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk than when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following (a sketch of the kind of information such an interface might expose follows the list):

• Design for appropriate trust, not greater trust.

• Show the past performance of the automation.

• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.

• Simplify the algorithms and operation of the automation to make it more understandable.

• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.

• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.

• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.
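
As a loose illustration of what the guidelines above might mean for an interface's data model, the sketch below names the kinds of information a trustable-automation display could expose. The class and all field names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class AutomationDisclosure:
    """Hypothetical record of what a trustable-automation interface
    exposes to the operator, mirroring the guidelines above."""
    past_performance: list[float] = field(default_factory=list)    # e.g., recent success rates
    intermediate_results: list[str] = field(default_factory=list)  # process steps, stated comprehensibly
    purpose: str = ""               # design basis, expressed in terms of user goals
    range_of_application: str = ""  # situations the automation was designed for
    expected_reliability: float = 1.0  # the expectation operators are trained to hold
```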

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent (a sketch of one such evaluation follows this list).
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
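
The evaluation guideline on calibration and resolution can be operationalized in several ways; the sketch below is one assumed operationalization, not an established metric: calibration as the mean absolute gap between rated trust and observed capability, and resolution as the spread of mean trust across situations in which capability actually differs. The data are hypothetical.

```python
from statistics import mean

def calibration_gap(trust_ratings, capabilities):
    """Mean absolute gap between trust and capability; 0 = perfectly calibrated."""
    return mean(abs(t - c) for t, c in zip(trust_ratings, capabilities))

def resolution(trust_by_situation):
    """Spread of mean trust across situations; near zero means trust fails to
    distinguish situations in which the automation's capability differs."""
    situation_means = [mean(r) for r in trust_by_situation.values()]
    return max(situation_means) - min(situation_means)

# Hypothetical ratings: the operator overtrusts the automation in storms.
trust = [0.9, 0.8, 0.9]
capability = [0.9, 0.8, 0.4]
print(calibration_gap(trust, capability))                 # ~0.17: overtrust
print(resolution({"clear": [0.9, 0.8], "storm": [0.9]}))  # 0.05: low resolution
```

Under this reading, context-dependent overtrust appears as a large calibration gap together with low resolution, even though average trust looks reasonable.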

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences the capability of the automation. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links between the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research on human-automation interaction and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations, because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure in the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta-trust in the other people: meta-trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.
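
No such multiperson model appears in the literature reviewed here, but even a toy simulation clarifies the question being posed. The sketch below assumes that each operator's trust moves partly toward directly experienced automation performance and partly toward the average trust expressed by coworkers (gossip and indirect exposure); the social weight, learning rate, and starting values are all hypothetical.

```python
def updated_trust(own_trust, coworker_trusts, experienced_performance,
                  social_weight=0.3, learning_rate=0.2):
    """Drift one operator's trust toward a blend of direct experience and
    the average opinion circulating in the team."""
    social = sum(coworker_trusts) / len(coworker_trusts)
    target = (1 - social_weight) * experienced_performance + social_weight * social
    return own_trust + learning_rate * (target - own_trust)

# Hypothetical three-person team sharing one reliable controller (performance 0.9).
team = [0.8, 0.5, 0.3]
for _ in range(5):
    team = [updated_trust(team[i], team[:i] + team[i + 1:], 0.9)
            for i in range(len(team))]
print([round(t, 2) for t in team])  # opinions converge as experience and gossip mix
```

A model of this kind could, for instance, generate predictions about how quickly distrust spreads through a team whose members rarely interact with the automation directly, which is exactly the sort of question the empirical literature has yet to address.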

CONCLUSION

"Emotion is critical for the appropriate direction of attention, since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.
Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.
Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.
Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.
Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.
Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.
Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.
Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.
Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.
Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.
Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.
Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.
Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect: An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.
Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171–184.
Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.
Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.
Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.
Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.
Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.
Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture, 213, 197–201.
Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.
Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.
Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.
Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf
Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.
Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.
Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.
Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).
Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.
Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.
Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.
Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.
Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 351, 1413–1420.
Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.
Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.
Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.
Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.
Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.
Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.
Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.
Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.
Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.
Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.
Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology: Learning, Memory, and Cognition, 26, 395–410.
Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.
Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.
Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.
Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.
Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.
Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.
Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.
Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.
Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.
Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.
Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy: A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.
Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.
Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.
Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.
Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.
Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.
Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.
Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.
Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.
Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.
Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.
Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.
Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.
Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.
Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.
Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.
Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.
Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.
Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.
Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.
Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.
Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.
Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.
Kikuchi, M., Watanabe, Y., & Yamagishi, T. (1996). Judgment accuracy of others' trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.
Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces: Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.
Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.
Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.
Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.
Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.
Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.
Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.
Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.
Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.
Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.
Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 30, 273–285.
Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.
Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology: Applied, 6, 104–123.
Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.
Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.
Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.
Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.
Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.

MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.
Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.
Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.
Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.
McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.
Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.
Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.
Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.
Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.
Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.
Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.
Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.
Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.
Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.
Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.
Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology: Applied, 6, 44–58.
Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.
Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.
Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.
Muir, B. M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 527–539.
Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.
Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.
Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.
Muller, G. (1996). Secure communication: Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.
Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology: Applied, 7, 171–181.
Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.
Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.
National Transportation Safety Board. (1997). Marine accident report: Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.
Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.
Nyhan, R. C. (2000). Changing the paradigm: Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.
Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.
Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.
Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.
Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.
Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.
Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.
Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.
Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.
Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 30, 286–297.
Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A: Computer Science and Technology, 13, 426–432.
Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.
Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.
Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.
Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.
Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.
Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.
Riegelsberger, J., & Sasse, M. A. (2002). Face it: Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.
Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.

Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.
Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.
Ring, P. S., & Van de Ven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.
Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.
Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.
Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.
Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.
Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.
Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.
Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.
Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.
Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.
Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.
Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.
Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.
Simon, H. (1957). Models of man. New York: Wiley.
Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.
Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.
Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.
Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.
Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.
Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.
Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.
Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.
Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychology Monographs, 126, 241–260.
Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.
Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.
Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 24, 1–86.
Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.
Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.
van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.
Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.
Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.
Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.
Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.
Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.
Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.
Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.
Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.
Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.
Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.
Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.
Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002. Date accepted: September 4, 2003.

IMPLICATIONS FOR CREATINGTRUSTABLE AUTOMATION

Design errors maintenance problems andunanticipated variability make completely reli-able and trustworthy automation unachievableFor this reason creating highly trustable automa-tion is important Trustable automation supportsadaptive reliance on the automation throughhighly calibrated high-resolution and high-specificity trust Appropriate trust can lead toperformance of the joint human-automationsystem that is superior to the performance ofeither the human or the automation alone (Sor-kin amp Woods 1985 Wickens et al 2000)The conceptual model in Figure 4 provides atheoretical basis to identify tentative designevaluation and training guidance and direc-

tions for future research that can lead to moretrustable automation

Make Automation Trustable

Appropriate trust and reliance depend onhow well the capabilities of the automation areconveyed to the user This can be done by mak-ing the algorithms of the automation simpleror by revealing their operation more clearlySpecific design evaluation and training con-siderations include the following

bull Design for appropriate trust not greater trust bull Show the past performance of the automationbull Show the process and algorithms of the automa-

tion by revealing intermediate results in a way thatis comprehensible to the operators

bull Simplify the algorithms and operation of the au-tomation to make it more understandable

bull Show the purpose of the automation design basisand range of applications in a way that relates tothe usersrsquo goals

bull Train operators regarding its expected reliabilitythe mechanisms governing its behavior and itsintended use

bull Carefully evaluate any anthropomorphizing ofthe automation such as using speech to create asynthetic conversational partner to ensure appro-priate trust

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.
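
To make the sonification suggestion concrete, one could imagine mapping the automation's current reliability estimate onto auditory parameters. The sketch below is an assumption-laden illustration rather than a validated design; the function name and parameter ranges are invented for this example.

    def reliability_to_tone(reliability: float) -> dict:
        """Map a reliability estimate in [0, 1] to parameters for an auditory cue."""
        reliability = max(0.0, min(1.0, reliability))  # clamp out-of-range inputs
        return {
            "pitch_hz": 220 + 440 * reliability,          # higher, steadier tone when reliable
            "pulses_per_sec": 1 + 4 * (1 - reliability),  # faster pulsing signals doubt
        }

    print(reliability_to_tone(0.9))  # e.g., {'pitch_hz': 616.0, 'pulses_per_sec': 1.4}

Whether such a mapping calibrates trust better than a visual indicator is precisely the sort of question this research agenda raises.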

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
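
The second of these considerations, showing how past performance depends on situations, can be illustrated in a few lines. The sketch below simply groups logged outcomes by a context label; the function and the labels are hypothetical, and a real system would need a principled scheme for classifying situations.

    from collections import defaultdict

    def reliability_by_context(log):
        """Estimate situation-specific reliability from (context, succeeded) pairs."""
        totals = defaultdict(lambda: [0, 0])  # context -> [successes, trials]
        for context, succeeded in log:
            totals[context][0] += int(succeeded)
            totals[context][1] += 1
        return {context: s / n for context, (s, n) in totals.items()}

    log = [("clear weather", True), ("clear weather", True),
           ("heavy clutter", False), ("heavy clutter", True)]
    print(reliability_by_context(log))  # {'clear weather': 1.0, 'heavy clutter': 0.5}

Displaying such context-conditioned estimates, rather than a single overall reliability figure, is one way to support the specificity and resolution of trust noted above.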

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.
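
One way to see why communication networks matter is a toy diffusion model in which each person's trust blends his or her own direct experience with the automation and the trust reported by coworkers. This sketch is purely illustrative and is not a model proposed here; every name and parameter in it is an assumption.

    def diffuse_trust(trust, links, experience, social_weight=0.5, steps=10):
        """Toy model: trust spreading over a communication network (illustrative only).

        trust: dict mapping person -> current trust in the automation, in [0, 1]
        links: dict mapping person -> list of coworkers he or she talks to
        experience: dict mapping person -> trust implied by direct use
        social_weight: weight on coworkers' views versus direct experience
        """
        for _ in range(steps):
            updated = {}
            for person in trust:
                peers = links.get(person, [])
                peer_view = (sum(trust[p] for p in peers) / len(peers)
                             if peers else trust[person])
                updated[person] = ((1 - social_weight) * experience[person]
                                   + social_weight * peer_view)
            trust = updated
        return trust

    # Each person's final trust blends direct experience with coworkers' reports.
    print(diffuse_trust(trust={"a": 0.9, "b": 0.2, "c": 0.5},
                        links={"a": ["b"], "b": ["a"], "c": ["a", "b"]},
                        experience={"a": 0.9, "b": 0.2, "c": 0.5}))

Even a model this crude makes the open questions visible: how direct experience, automation reliability, and network structure combine is exactly what remains to be studied.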

CONCLUSION

"Emotion is critical for the appropriate direction of attention, since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization; as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.


Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oatley, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies, and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003



In summary the effect of subtle cues that areoften associated with concrete examples andphysical interactions may reflect a more gener-al finding that specific instances and salientimages invoke affective responses more power-fully than do abstract data (Finucane Alha-kami Slovic amp Johnson 2000 Loewenstein et al 2001) Situations described in terms offrequency counts vivid images and specificinstances lead to different assessments of riskas compared with when data are presented ascontext-free percentages and probabilities (Slov-ic Finucane Peters amp McGregor in press)Affect-oriented display characteristics that influ-ence risk judgments may have a similar powerto influence trust

IMPLICATIONS FOR CREATINGTRUSTABLE AUTOMATION

Design errors maintenance problems andunanticipated variability make completely reli-able and trustworthy automation unachievableFor this reason creating highly trustable automa-tion is important Trustable automation supportsadaptive reliance on the automation throughhighly calibrated high-resolution and high-specificity trust Appropriate trust can lead toperformance of the joint human-automationsystem that is superior to the performance ofeither the human or the automation alone (Sor-kin amp Woods 1985 Wickens et al 2000)The conceptual model in Figure 4 provides atheoretical basis to identify tentative designevaluation and training guidance and direc-

tions for future research that can lead to moretrustable automation

Make Automation Trustable

Appropriate trust and reliance depend onhow well the capabilities of the automation areconveyed to the user This can be done by mak-ing the algorithms of the automation simpleror by revealing their operation more clearlySpecific design evaluation and training con-siderations include the following

bull Design for appropriate trust not greater trust bull Show the past performance of the automationbull Show the process and algorithms of the automa-

tion by revealing intermediate results in a way thatis comprehensible to the operators

bull Simplify the algorithms and operation of the au-tomation to make it more understandable

bull Show the purpose of the automation design basisand range of applications in a way that relates tothe usersrsquo goals

bull Train operators regarding its expected reliabilitythe mechanisms governing its behavior and itsintended use

bull Carefully evaluate any anthropomorphizing ofthe automation such as using speech to create asynthetic conversational partner to ensure appro-priate trust

Substantial research issues face the designof trustable automation First little work hasaddressed how interface features influenceaffect Recent neurological evidence suggeststhe existence of structurally separate pathwaysfor affective and analytic responses and thisdifference may imply equally separate interfaceconsiderations For example music and prosodymay be particularly adept at influencing affec-tive responses (Panksepp amp Bernatzky 2002)and so sonification (Brewster 1997) rather thanvisualization may be an effective way to cali-brate trust Another important research issueis the trade-off between trustworthy automa-tion and trustable automation Trustworthyautomation is automation that performs effi-ciently and reliably Achieving this performancesometimes requires very complex algorithmsthat can be extremely hard to understand Tothe extent that system performance depends onappropriate trust there may be some circum-stances in which making automation simplerbut less capable outweighs the benefits of mak-ing it more complex and trustworthy but less

TRUST IN AUTOMATION 75

trustable The trade-off between trustworthyand trustable automation and the degree towhich interface design and training can miti-gate this trade-off merit investigation

Relate Context to Capability of theAutomation

Appropriate trust depends on the operatorrsquosunderstanding of how the context affects thecapability of the automation Specific designevaluation and training considerations includethe following

bull Reveal the context and support the assessmentand classification of the situations relative to thecapability of the automation

bull Show how past performance depends on situa-tions

bull Consider how the context influences the relevanceof trust the final authority for some time-criticaldecisions should be allocated to automation

bull Evaluate the appropriateness of trust according tocalibration resolution and specificity Specificityand resolution are particularly critical when theperformance of the automation is highly contextdependent

bull Training to promote appropriate trust shouldaddress how situations interact with the charac-teristics of the automation to affect its capability

Little research has addressed the challengesof promoting appropriate trust in the face of adynamic context that influences its capabilityThe concept of ecological interface design hasled to a substantial research effort that has fo-cused on revealing the physical and intentionalconstraints of the system being controlled (Ras-mussen amp Vicente 1989 Vicente 2002 Vicenteamp Rasmussen 1992) This approach could beviewed as conveying the context to the operatorbut no research has considered how to relatethis representation to one that describes thecapabilities of the automation Several research-ers have acknowledged the importance of con-text in developing appropriate trust (Cohen etal 1999 Masalonis 2000) but few have con-sidered the challenge of integrating ecologicalinterface design with interface design for auto-mation (Furukawa amp Inagaki 1999 2001) Im-portant issues regarding this integration includethe links among the three levels of attributionalabstraction (purpose process and performance)and the dimension of detail which describethe characteristics of the automation and the

abstraction hierarchy Another important con-sideration is the link between the interface im-plications of skill- rule- and knowledge-basedperformance and those of the analytic analogi-cal and affective processes that govern trust

Consider Cultural Organizational andTeam Interactions

Trust and reliance depend on organizationaland cultural factors that go beyond the individ-ual operator working with automation Ex-trapolation from the relatively little research inhuman-automation interactions and the con-siderable research base on organization andinterpersonal trust suggests the following de-sign evaluation and training considerations

bull Consider the organizational context and the indi-rect exposure to automation that it facilitates

bull Consider individual and cultural differences inevaluations because they may influence reliancein ways that are not directly related to the char-acteristics of the automation

bull Cultural differences regarding expectations of theautomation can be a powerful force that can leadto misuse and disuse unless addressed by appro-priate training

bull Gossip and informal discussion of automationcapabilities need to be considered in training andretraining so that the operatorsrsquo understanding ofthe automation reflects its true capabilities

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.
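One way to frame these open questions is as a toy simulation. The following sketch is entirely an illustrative assumption rather than a model from this paper: each person's trust is nudged by coworkers' opinions spread over a communication network and by intermittent direct experience of the automation's reliability.

```python
# Toy model (an assumption for illustration): trust diffuses through a
# communication network while direct experience pulls it toward the
# automation's actual reliability.

import random

def step(trust, network, reliability, w_social=0.3, w_direct=0.2):
    new_trust = {}
    for person, coworkers in network.items():
        social = sum(trust[c] for c in coworkers) / len(coworkers)  # "gossip"
        direct = 1.0 if random.random() < reliability else 0.0      # one interaction
        new_trust[person] = ((1 - w_social - w_direct) * trust[person]
                             + w_social * social + w_direct * direct)
    return new_trust

network = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}  # hypothetical team
trust = {"A": 0.5, "B": 0.5, "C": 0.5}
for _ in range(50):
    trust = step(trust, network, reliability=0.8)
print(trust)  # trust drifts toward observed reliability, smoothed by gossip
```

Even a crude model like this suggests how indirect exposure could dominate when direct interaction is rare, which is the kind of dynamic the research questions above would need to address.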

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization; as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect: An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology: Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy: A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oatley, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Watanabe, Y., & Yamagishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces: Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology: Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.

MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished PhD dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology: Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished PhD dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication: Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology: Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report: Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm: Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A: Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it: Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.

Riley, V. A. (1994). Human use of automation. Unpublished PhD dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his PhD in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an MS in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003




Dzindolet M T Pierce L G Beck H P amp Dawe L A (2002)The perceived utility of human and automated aids in a visualdetection task Human Factors 44 79ndash94

Dzindolet M T Pierce L G Beck H P Dawe L A ampAnderson B W (2001) Predicting misuse and disuse of com-bat identification systems Military Psychology 13 147ndash164

Entin E B Entin E E amp Serfaty D (1996) Optimizing aidedtarget-recognition performance In Proceedings of the HumanFactors and Ergonomics Society 40th Annual Meeting (pp233ndash237) Santa Monica CA Human Factors and ErgonomicsSociety

Farrell S amp Lewandowsky S (2000) A connectionist model ofcomplacency and adaptive recovery under automation Journalof Experimental Psychology ndash Learning Memory and Cognition26 395ndash410

Fine G A amp Holyfield L (1996) Secrecy trust and dangerousleisure Generating group cohesion in voluntary organizationsSocial Psychology Quarterly 59 22ndash38

Finucane M L Alhakami A Slovic P amp Johnson S M (2000)The affect heuristic in judgments of risks and benefits Journalof Behavioral Decision Making 13 1ndash17

Fishbein M amp Ajzen I (1975) Belief attitude intention andbehavior Reading MA Addison-Wesley

Fogg B Marshall J Kameda T Solomon J Rangnekar ABoyd J et al (2001) Web credibility research A method foronline experiments and early study results In CHI 2001Conference on Human Factors in Computing Systems (pp293ndash294) New York Association for Computing Machinery

Fogg B Marshall J Laraki O Osipovich A Varma C FangN et al (2001) What makes Web sites credible A report ona large quantitative study In CHI 2001 Conference on HumanFactors in Computing Systems (pp 61ndash68) New York Asso-ciation for Computing Machinery

Fox J M (1996) Effects of information accuracy on user trustand compliance In CHI 1996 Conference on Human Factorsin Computing Systems (pp 35-36) New York Association forComputing Machinery

Fox J M amp Boehm-Davis D A (1998) Effects of age and con-gestion information accuracy of advanced traveler informationon user trust and compliance Transportation Research Record1621 43ndash49

Furukawa H amp Inagaki T (1999) Situation-adaptive interfacebased on abstraction hierarchies with an updating mechanismfor maintaining situation awareness of plant operators InProceedings of the 1999 IEEE International Conference on

78 Spring 2004 ndash Human Factors

Systems Man and Cybernetics (pp 693ndash698) Piscataway NJ IEEE

Furukawa H amp Inagaki T (2001) A graphical interface designfor supporting human-machine cooperation Representingautomationrsquos intentions by means-ends relations In Proceedingsof the 8th IFACIFIPIFORSIEA Symposium on AnalysisDesign and Evaluation of Human-Machine Systems (pp573ndash578) Laxenburg Austria International Federation ofAutomatic Control

Gabarro J J (1978) The development of trust influence and ex-pectations In A G Athos amp J J Gabarro (Eds) Interpersonalbehavior Communication and understanding in relationships(pp 290ndash230) Englewood Cliffs NJ Prentice Hall

Gaines S O Panter A T Lyde M D Steers W N Rusbult CE Cox C L et al (1997) Evaluating the circumplexity ofinterpersonal traits and the manifestation of interpersonal traitsin interpersonal trust Journal of Personality and SocialPsychology 73 610ndash623

Gist M E amp Mitchell T R (1992) Self-efficacy ndash A theoreticalanalysis of its determinants and malleability Academy ofManagement Review 17 183ndash211

Gong L Nass C Simard C amp Takhteyev Y (2001) Whennon-human is better than semi-human Consistency in speechinterfaces In M J Smith G Salvendy D Harris amp R Koubek(Eds) Usability evaluation and interface design Cognitiveengineering intelligent agents and virtual reality (pp 1558ndash1562) Mahwah NJ Erlbaum

Grice H P (Ed) (1975) Logic and conversation Vol 3 Speechacts New York Academic

Gurtman M B (1992) Trust distrust and interpersonal prob-lems A circumplex analysis Journal of Personality and SocialPsychology 62 989ndash1002

Halprin S Johnson E amp Thornburry J (1973) Cognitive relia-bility in manned systems IEEE Transactions on Reliability R-22 165-169

Hardin R (1992) The street-level epistemology of trust Politicsand Society 21 505ndash529

Hoc J M (2000) From human-machine interaction to human-machine cooperation Ergonomics 43 833ndash843

Holmes J G (1991) Trust and the appraisal process in close rela-tionships In W H Jones amp D Perlman (Eds) Advances inpersonal relationships (Vol 2 pp 57-104) London JessicaKingsley

Hovland C I Janis I L amp Kelly H H (1953) Communicationand persuasion Psychological studies of opinion change NewHaven CT Yale University Press

Huang H Keser C Leland J amp Shachat J (2002) Trust theInternet and the digital divide (RC22511) Yorktown HeightsNY IBM Research Division

Inagaki T Takae Y amp Moray N (1999) Decision supported in-formation for takeoff safety in human-centered automation Anexperimental investigation of time-fragile characteristics InProceedings of the IEEE International Conference on SystemsMan and Cybernetics (Vol 1 pp 1101ndash1106) Piscataway NJIEEE

Itoh M Abe G amp Tanaka K (1999) Trust in and use of auto-mation Their dependence on occurrence patterns of malfunc-tions In Proceedings of the IEEE International Conference onSystems Man and Cybernetics (Vol 3 pp 715ndash720) Piscata-way NJ IEEE

Janis I L amp Mann L (1977) Decision making A psychologicalanalysis of conflict choice and commitment New York Free

Jennings E E (1967) The mobile manager A study of the newgeneration of top executives Ann Arbor MI Bureau of In-dustrial Relations University of Michigan

Johns J L (1996) A concept analysis of trust Journal of Ad-vanced Nursing 24 76ndash83

Johnson-Laird P N amp Oately K (1992) Basic emotions ratio-nality and folk theory Cognition and Emotion 6 201ndash223

Jones G R amp George J M (1998) The experience and evolutionof trust Implications for cooperation and teamwork Academyof Management Review 23 531ndash546

Kantowitz B H Hanowski R J amp Kantowitz S C (1997a)Driver acceptance of unreliable traffic information in familiarand unfamiliar settings Human Factors 39 164ndash176

Kantowitz B H Hanowski R J amp Kantowitz S C (1997b)Driver reliability requirements for traffic advisory informationIn Y I Noy (Ed) Ergonomics and safety of intelligent driverinterfaces (pp 1ndash22) Mahwah NJ Erlbaum

Karvonen K (2001) Designing trust for a universal audience Amulticultural study on the formation of trust in the internet inthe Nordic countries In C Stephanidis (Ed) First InternationalConference on Universal Access in Human-Computer Inter-action (Vol 3 pp 1078ndash1082) Mahwah NJ Erlbaum

Karvonen K amp Parkkinen J (2001) Signs of trust A semioticstudy of trust formation in the Web In M J Smith GSalvendy D Harris amp R J Koubek (Eds) First InternationalConference on Universal Access in Human-Computer Inter-action (Vol 1 pp 1076ndash1080) Mahwah NJ Erlbaum

Kee H W amp Knox R E (1970) Conceptual and methodologicalconsiderations in the study of trust and suspicion ConflictResolution 14 357ndash366

Kelso J A S (1995) Dynamic patterns Self-organization of brainand behavior Cambridge MA MIT Press

Kikuchi M Wantanabe Y amp Yamasishi T (1996) Judgmentaccuracy of otherrsquos trustworthiness and general trust An exper-imental study Japanese Journal of Experimental SocialPsychology 37 23ndash36

Kim J amp Moon J Y (1998) Designing towards emotional usabil-ity in customer interfaces ndash Trustworthiness of cyber-bankingsystem interfaces Interacting With Computers 10 1ndash29

Kirlik A (1993) Modeling strategic behavior in human-automa-tion interaction Why an ldquoaidrdquo can (and should) go unusedHuman Factors 35 221ndash242

Klein G A (1989) Recognition-primed decisions In W B Rouse(Ed) Advances in man-machine system research (Vol 5 pp47ndash92) Greenwich CT JAI

Kramer R M (1999) Trust and distrust in organizations Emerg-ing perspectives enduring questions Annual Review of Psy-chology 50 569ndash598

Kramer R M Brewer M B amp Hanna B A (1996) Collectivetrust and collective action The decision to trust as a socialdecision In R M Kramer amp T R Tyler (Eds) Trust in organi-zations Frontiers of theory and research (pp 357ndash389)Thousand Oaks CA Sage

Kramer R M amp Tyler T R (Eds) (1996) Trust in organizationsFrontiers of theory and research Thousand Oaks CA Sage

Lee J D Gore B F amp Campbell J L (1999) Display alterna-tives for in-vehicle warning and sign information Messagestyle location and modality Transportation Human Factors 1347ndash377

Lee J D amp Moray N (1992) Trust control strategies and alloca-tion of function in human-machine systems Ergonomics 351243ndash1270

Lee J D amp Moray N (1994) Trust self-confidence and opera-torsrsquo adaptation to automation International Journal ofHuman-Computer Studies 40 153ndash184

Lee J D amp Sanquist T F (1996) Maritime automation In RParasuraman amp M Mouloua (Eds) Automation and humanperformance (pp 365ndash384) Mahwah NJ Erlbaum

Lee J D amp Sanquist T F (2000) Augmenting the operator func-tion model with cognitive operations Assessing the cognitivedemands of technological innovation in ship navigation IEEETransactions on Systems Man and Cybernetics ndash Part ASystems and Humans 30 273ndash285

Lee M K O amp Turban E (2001) A trust model for consumerInternet shopping International Journal of Electronic Com-merce 6(1) 75ndash91

Lewandowsky S Mundy M amp Tan G (2000) The dynamics oftrust Comparing humans to automation Journal of Experi-mental Psychology ndash Applied 6 104ndash123

Lewicki R J amp Bunker B B (1996) Developing and maintain-ing trust in work relationships In R M Kramer amp T R Tyler(Eds) Trust in organizations Frontiers of theory and research(pp 114-139) Thousand Oaks CA Sage

Lewis M (1998) Designing for human-agent interaction AIMagazine 19(2) 67ndash78

Lichtenstein S Fischhoff B amp Phillips L D (1982) Calibrationof probabilities The state of the art to 1980 In D KahnemanP Slovic amp A Tversky (Eds) Judgment under uncertaintyHeuristics and biases (pp 306ndash334) New York CambridgeUniversity Press

Liu Y Fuld R amp Wickens C D (1993) Monitoring behavior inmanual and automated scheduling systems InternationalJournal of Man-Machine Studies 39 1015ndash1029

Loewenstein G F Weber E U Hsee C K amp Welch N (2001)Risk as feelings Psychological Bulletin 127 267ndash286

TRUST IN AUTOMATION 79

MacMillan J Entin E B amp Serfaty D (1994) Operator relianceon automated support for target acquisition In Proceedings ofthe Human Factors and Ergonomics Society 38th AnnualMeeting (pp 1285ndash1289) Santa Monica CA Human Factorsand Ergonomics Society

Malhotra D amp Murnighan J K (2002) The effects of contractson interpersonal trust Administrative Science Quarterly 47534ndash559

Masalonis A J (2000) Effects of situation-specific reliability ontrust and usage of automated decision aids Unpublished PhDdissertation Catholic University of America Washington DC

Mayer R C Davis J H amp Schoorman F D (1995) An integra-tive model of organizational trust Academy of ManagementReview 20 709ndash734

McKnight D H Cummings L L amp Chervany N L (1998)Initial trust formation in new organizational relationshipsAcademy of Management Review 23 473ndash490

Meyer J (2001) Effects of warning validity and proximity onresponses to warnings Human Factors 43 563ndash572

Meyerson D Weick K E amp Kramer R M (1996) Swift trustand temporary groups In R M Kramer amp T R Tyler (Eds)Trust in organizations Frontiers of theory and research (pp166ndash195) Thousand Oaks CA Sage

Milewski A E amp Lewis S H (1997) Delegating to softwareagents International Journal of Human-Computer Studies 46485ndash500

Miller C A (2002) Definitions and dimensions of etiquette In CMiller (Ed) Etiquette for human-computer work (Tech ReportFS-02-02 pp 1ndash7) Menlo Park CA American Association forArtificial Intelligence

Mishra A K (1996) Organizational response to crisis In R MKramer amp T R Tyler (Eds) Trust in organizations Frontiers oftheory and research (pp 261ndash287) Thousand Oaks CA Sage

Molloy R amp Parasuraman R (1996) Monitoring an automatedsystem for a single failure Vigilance and task complexityeffects Human Factors 38 311ndash322

Montgomery D A amp Sorkin R D (1996) Observer sensitivityto element reliability in a multielement visual display HumanFactors 38 484ndash494

Moorman C Deshpande R amp Zaltman G (1993) Factors affect-ing trust in market-research relationships Journal of Marketing57(1) 81ndash101

Moray N (2003) Monitoring complacency scepticism and eutacticbehaviour International Journal of Industrial Ergonomics 31175ndash178

Moray N amp Inagaki T (1999) Laboratory studies of trust be-tween humans and machines in automated systems Transactionsof the Institute of Measurement and Control 21 203ndash211



These results suggest that it may be possible to use degrees of synthetic speech to calibrate trust in a manner similar to the way degraded images can calibrate trust. However, care must be taken in adopting speech interactions because subtle differences may have important effects on trust and reliance. People may comply with command-based messages in a way that they might not with informational messages (Crocoll & Coury, 1990; Lee et al., 1999). More generally, using speech to create a conversational partner, as suggested by Cassell and Bickmore (2000), may lead people to attribute human characteristics to the automation in such a way as to induce false expectations that could lead to inappropriate trust.

In summary, the effect of subtle cues that are often associated with concrete examples and physical interactions may reflect a more general finding: specific instances and salient images invoke affective responses more powerfully than do abstract data (Finucane, Alhakami, Slovic, & Johnson, 2000; Loewenstein et al., 2001). Situations described in terms of frequency counts, vivid images, and specific instances lead to different assessments of risk as compared with when data are presented as context-free percentages and probabilities (Slovic, Finucane, Peters, & McGregor, in press). Affect-oriented display characteristics that influence risk judgments may have a similar power to influence trust.

IMPLICATIONS FOR CREATING TRUSTABLE AUTOMATION

Design errors, maintenance problems, and unanticipated variability make completely reliable and trustworthy automation unachievable. For this reason, creating highly trustable automation is important. Trustable automation supports adaptive reliance on the automation through highly calibrated, high-resolution, and high-specificity trust. Appropriate trust can lead to performance of the joint human-automation system that is superior to the performance of either the human or the automation alone (Sorkin & Woods, 1985; Wickens et al., 2000). The conceptual model in Figure 4 provides a theoretical basis to identify tentative design, evaluation, and training guidance and directions for future research that can lead to more trustable automation.
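To make calibration and resolution concrete, the following minimal sketch scores how well logged trust ratings track the automation's actual capability. The metrics are illustrative proxies chosen for this sketch, not measures proposed in the paper:

```python
# Hedged sketch: scoring how well logged trust ratings track automation
# capability. Metric choices are illustrative assumptions, not from the paper.
from statistics import correlation  # Python 3.10+

# Hypothetical per-trial log, both values on a 0-1 scale.
trust      = [0.9, 0.8, 0.7, 0.9, 0.3, 0.4]
capability = [0.95, 0.9, 0.6, 0.85, 0.35, 0.5]

# Calibration: mean absolute gap between stated trust and actual capability.
calibration_error = sum(abs(t - c) for t, c in zip(trust, capability)) / len(trust)

# Resolution: a crude proxy is how strongly trust covaries with capability.
resolution = correlation(trust, capability)

print(f"calibration error: {calibration_error:.2f} (0 = perfectly calibrated)")
print(f"resolution (r): {resolution:.2f} (1 = trust fully discriminates)")
```

Low calibration error with high resolution would correspond to the well-calibrated, high-resolution trust described above; either score degrading flags a mismatch worth addressing through design or training.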

Make Automation Trustable

Appropriate trust and reliance depend on how well the capabilities of the automation are conveyed to the user. This can be done by making the algorithms of the automation simpler or by revealing their operation more clearly. Specific design, evaluation, and training considerations include the following (a brief sketch after the list illustrates how past performance might be exposed):

• Design for appropriate trust, not greater trust.
• Show the past performance of the automation.
• Show the process and algorithms of the automation by revealing intermediate results in a way that is comprehensible to the operators.
• Simplify the algorithms and operation of the automation to make it more understandable.
• Show the purpose of the automation, design basis, and range of applications in a way that relates to the users' goals.
• Train operators regarding its expected reliability, the mechanisms governing its behavior, and its intended use.
• Carefully evaluate any anthropomorphizing of the automation, such as using speech to create a synthetic conversational partner, to ensure appropriate trust.
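As a minimal illustration of the second guideline, the sketch below shows one way an aid could accumulate and expose its own track record to the operator. The class and field names are hypothetical, not an API from the paper:

```python
# Hedged sketch of a status model that exposes an automated aid's past
# performance to the operator. Names are illustrative, not from the paper.
from dataclasses import dataclass, field

@dataclass
class AidStatus:
    outcomes: list = field(default_factory=list)  # True = aid was correct

    def record(self, correct: bool) -> None:
        self.outcomes.append(correct)

    def past_performance(self, window: int = 50) -> float:
        """Hit rate over recent trials -- the 'show past performance' guideline."""
        recent = self.outcomes[-window:]
        return sum(recent) / len(recent) if recent else float("nan")

status = AidStatus()
for ok in [True, True, False, True]:
    status.record(ok)

# An interface could render this value beside the aid's recommendation,
# rather than asking the operator to infer reliability from memory.
print(f"aid hit rate: {status.past_performance():.0%}")
```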

Substantial research issues face the design of trustable automation. First, little work has addressed how interface features influence affect. Recent neurological evidence suggests the existence of structurally separate pathways for affective and analytic responses, and this difference may imply equally separate interface considerations. For example, music and prosody may be particularly adept at influencing affective responses (Panksepp & Bernatzky, 2002), and so sonification (Brewster, 1997), rather than visualization, may be an effective way to calibrate trust. Another important research issue is the trade-off between trustworthy automation and trustable automation. Trustworthy automation is automation that performs efficiently and reliably. Achieving this performance sometimes requires very complex algorithms that can be extremely hard to understand. To the extent that system performance depends on appropriate trust, there may be some circumstances in which making automation simpler but less capable outweighs the benefits of making it more complex and trustworthy but less trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following (a brief sketch after the list illustrates situation-specific performance tracking):

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.
• Show how past performance depends on situations.
• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.
• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent.
• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
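As a minimal illustration of situation-specific performance tracking, the sketch below conditions reliability estimates on a context label. The context labels and data structure are illustrative assumptions, not from the paper:

```python
# Hedged sketch: situation-specific reliability, so trust can be calibrated
# to the contexts where the automation is weak or strong.
from collections import defaultdict

history = defaultdict(lambda: [0, 0])  # context -> [successes, trials]

def record(context: str, success: bool) -> None:
    stats = history[context]
    stats[0] += int(success)
    stats[1] += 1

def reliability(context: str) -> float:
    successes, trials = history[context]
    return successes / trials if trials else float("nan")

# Hypothetical log for a collision-warning aid in two driving contexts.
for ctx, ok in [("clear", True), ("clear", True), ("rain", False), ("rain", True)]:
    record(ctx, ok)

# A display could show these context-conditioned rates so operators apply
# high trust in clear weather and more scrutiny in rain.
for ctx in history:
    print(f"{ctx}: {reliability(ctx):.0%} over {history[ctx][1]} trials")
```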

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences its capability. The concept of ecological interface design has led to a substantial research effort that has focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research in human-automation interactions and the considerable research base on organization and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.
• Consider individual and cultural differences in evaluations because they may influence reliance in ways that are not directly related to the characteristics of the automation.
• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.
• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that the operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. Of the few studies that have considered these issues, the findings suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure on the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.
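A toy simulation can make the interplay of direct experience and socially communicated trust concrete. The sketch below is a speculative illustration under stated assumptions (the update rule and weights are invented for illustration), not a model from the paper:

```python
# Hedged sketch: trust diffusing through a team that shares one automated
# system, blending direct experience with coworkers' reported trust.
import random

random.seed(1)
RELIABILITY = 0.8     # true chance the automation succeeds on a trial
SOCIAL_WEIGHT = 0.3   # how strongly coworkers' trust pulls on one's own

trust = {"op_a": 0.5, "op_b": 0.5, "op_c": 0.5}  # hypothetical operators

for _ in range(100):
    for op in trust:
        direct = 1.0 if random.random() < RELIABILITY else 0.0
        peers = [t for other, t in trust.items() if other != op]
        social = sum(peers) / len(peers)
        # Blend direct experience with the team's average reported trust.
        target = (1 - SOCIAL_WEIGHT) * direct + SOCIAL_WEIGHT * social
        trust[op] += 0.1 * (target - trust[op])

# With a reliable system, each operator's trust converges near RELIABILITY;
# the social term stands in for the gossip and informal discussion noted above.
print({op: round(t, 2) for op, t in trust.items()})
```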

CONCLUSION

"Emotion is critical for the appropriate direction of attention, since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization; as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.


Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field-study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.

TRUST IN AUTOMATION 79

MacMillan J Entin E B amp Serfaty D (1994) Operator relianceon automated support for target acquisition In Proceedings ofthe Human Factors and Ergonomics Society 38th AnnualMeeting (pp 1285ndash1289) Santa Monica CA Human Factorsand Ergonomics Society

Malhotra D amp Murnighan J K (2002) The effects of contractson interpersonal trust Administrative Science Quarterly 47534ndash559

Masalonis A J (2000) Effects of situation-specific reliability ontrust and usage of automated decision aids Unpublished PhDdissertation Catholic University of America Washington DC

Mayer R C Davis J H amp Schoorman F D (1995) An integra-tive model of organizational trust Academy of ManagementReview 20 709ndash734

McKnight D H Cummings L L amp Chervany N L (1998)Initial trust formation in new organizational relationshipsAcademy of Management Review 23 473ndash490

Meyer J (2001) Effects of warning validity and proximity onresponses to warnings Human Factors 43 563ndash572

Meyerson D Weick K E amp Kramer R M (1996) Swift trustand temporary groups In R M Kramer amp T R Tyler (Eds)Trust in organizations Frontiers of theory and research (pp166ndash195) Thousand Oaks CA Sage

Milewski A E amp Lewis S H (1997) Delegating to softwareagents International Journal of Human-Computer Studies 46485ndash500

Miller C A (2002) Definitions and dimensions of etiquette In CMiller (Ed) Etiquette for human-computer work (Tech ReportFS-02-02 pp 1ndash7) Menlo Park CA American Association forArtificial Intelligence

Mishra A K (1996) Organizational response to crisis In R MKramer amp T R Tyler (Eds) Trust in organizations Frontiers oftheory and research (pp 261ndash287) Thousand Oaks CA Sage

Molloy R amp Parasuraman R (1996) Monitoring an automatedsystem for a single failure Vigilance and task complexityeffects Human Factors 38 311ndash322

Montgomery D A amp Sorkin R D (1996) Observer sensitivityto element reliability in a multielement visual display HumanFactors 38 484ndash494

Moorman C Deshpande R amp Zaltman G (1993) Factors affect-ing trust in market-research relationships Journal of Marketing57(1) 81ndash101

Moray N (2003) Monitoring complacency scepticism and eutacticbehaviour International Journal of Industrial Ergonomics 31175ndash178

Moray N amp Inagaki T (1999) Laboratory studies of trust be-tween humans and machines in automated systems Transactionsof the Institute of Measurement and Control 21 203ndash211

Moray N Inagaki T amp Itoh M (2000) Adaptive automationtrust and self-confidence in fault management of time-criticaltasks Journal of Experimental Psychology ndash Applied 6 44ndash58

Morgan R M amp Hunt S D (1994) The commitment-trust theoryof relationship marketing Journal of Marketing 58(3) 20ndash38

Mosier K L Skitka L J Heers S amp Burdick M (1998)Automation bias Decision making and performance in high-tech cockpits International Journal of Aviation Psychology 847ndash63

Mosier K L Skitka L J amp Korte K J (1994) Cognitive andsocial issues in flight crewautomation interaction In MMouloua amp R Parasuraman (Eds) Human performance inautomated systems Current research and trends (pp 191ndash197)Hillsdale NJ Erlbaum

Muir B M (1987) Trust between humans and machines and thedesign of decision aides International Journal of Man-Machine Studies 27 527ndash539

Muir B M (1989) Operatorsrsquo trust in and use of automatic con-trollers in a supervisory process control task UnpublishedPhD dissertation University of Toronto Toronto Canada

Muir B M (1994) Trust in automation 1 Theoretical issues inthe study of trust and human intervention in automated sys-tems Ergonomics 37 1905ndash1922

Muir B M amp Moray N (1996) Trust in automation 2 Experi-mental studies of trust and human intervention in a processcontrol simulation Ergonomics 39 429ndash460

Muller G (1996) Secure communication ndash Trust in technology ortrust with technology Interdisciplinary Science Reviews 21336ndash347

Nass C amp Lee K N (2001) Does computer-synthesized speechmanifest personality Experimental tests of recognition similarity-attraction and consistency-attraction Journal ofExperimental Psychology ndash Applied 7 171ndash181

Nass C amp Moon Y (2000) Machines and mindlessness Socialresponses to computers Journal of Social Issues 56 81ndash103

Nass C Moon Y Fogg B J Reeves B amp Dryer D C (1995)Can computer personalities be human personalities Interna-tional Journal of Human-Computer Studies 43 223ndash239

National Transportation Safety Board (1997) Marine accidentreport ndash Grounding of the Panamanian passenger ship RoyalMajesty on Rose and Crown Shoal near Nantucket Massa-chusetts June 10 1995 (NTSBMAR9701) Washington DCAuthor

Norman D A Ortony A amp Russell D M (2003) Affect andmachine design Lessons for the development of autonomousmachines IBM Systems Journal 42 38ndash44

Nyhan R C (2000) Changing the paradigm ndash Trust and its role inpublic sector organizations American Review of Public Admini-stration 30 87ndash109

Ockerman J J (1999) Over-reliance issues with task-guidancesystems In Proceedings of the Human Factors and ErgonomicsSociety 43rd Annual Meeting (pp 1192ndash1196) Santa MonicaCA Human Factors and Ergonomics Society

Ostrom E (1998) A behavioral approach to the rational choicetheory of collective action American Political Science Review92 1ndash22

Pally R (1998) Emotional processing The mind-body connectionInternational Journal of Psycho-Analysis 79 349ndash362

Panksepp J amp Bernatzky G (2002) Emotional sounds and thebrain The neuro-affective foundations of musical appreciationBehavioural Processes 60 133ndash155

Parasuraman R Molloy R amp Singh I (1993) Performance con-sequences of automation-induced ldquocomplacencyrdquo InternationalJournal of Aviation Psychology 3 1ndash23

Parasuraman R Mouloua M amp Molloy R (1994) Monitoringautomation failures in human-machine systems In M Moulouaamp R Parasuraman (Eds) Human performance in automatedsystems Current research and trends (pp 45ndash49) HillsdaleNJ Erlbaum

Parasuraman R Mouloua M amp Molloy R (1996) Effects ofadaptive task allocation on monitoring of automated systemsHuman Factors 38 665ndash679

Parasuraman R amp Riley V (1997) Humans and automationUse misuse disuse abuse Human Factors 39 230ndash253

Parasuraman R Sheridan T B amp Wickens C D (2000) A modelfor types and levels of human interaction with automationIEEE Transactions on Systems Man and Cybernetics ndash Part ASystems and Humans 30 286ndash297

Parasuraman S Singh I L Molloy R amp Parasuraman R (1992)Automation-related complacency A source of vulnerability incontemporary organizations IFIP Transactions A ndash ComputerScience and Technology 13 426ndash432

Picard R W (1997) Affective computing Cambridge MA MITPress

Rasmussen J (1983) Skills rules and knowledge Signals signsand symbols and other distinctions in human performancemodels IEEE Transactions on Systems Man and CyberneticsSMC-13 257ndash266

Rasmussen J Pejterson A M amp Goodstein L P (1994) Cog-nitive systems engineering New York Wiley

Rasmussen J amp Vicente K J (1989) Coping with human errorsthrough system design Implications for ecological interfacedesign International Journal of Man-Machine Studies 31517ndash534

Reeves B amp Nass C (1996) The media equation How peopletreat computers television and new media like real people andplaces New York Cambridge University Press

Rempel J K Holmes J G amp Zanna M P (1985) Trust in closerelationships Journal of Personality and Social Psychology49(1) 95ndash112

Riegelsberger J amp Sasse M A (2002) Face it ndash Photos donrsquotmake a Web site trustworthy In CHI 2002 Conference onHuman Factors in Computing Systems Extended Abstracts (pp742ndash743) New York Association for Computing Machinery

Riley V (1996) Operator reliance on automation Theory anddata In R Parasuraman amp M Mouloua (Eds) Automationtheory and applications (pp 19ndash35) Mahwah NJ Erlbaum

80 Spring 2004 ndash Human Factors

Riley V A (1994) Human use of automation Unpublished PhDdissertation University of Minnesota Minneapolis MN

Rilling J K Gutman D A Zeh T R Pagnoni G Berns G Samp Kilts C D (2002) A neural basis for social cooperationNeuron 35 395ndash405

Ring P S amp Vandeven A H (1992) Structuring cooperativerelationships between organizations Strategic ManagementJournal 13 483ndash498

Ross W amp LaCroix J (1996) Multiple meanings of trust in nego-tiation theory and research A literature review and integrativemodel International Journal of Conflict Management 7314ndash360

Rotter J B (1967) A new scale for the measurement of interper-sonal trust Journal of Personality 35 651ndash665

Rotter J B (1971) Generalized expectancies for interpersonaltrust American Psychologist 26 443ndash452

Rotter J B (1980) Interpersonal trust trustworthiness and gulli-bility American Psychologist 35 1ndash7

Rousseau D Sitkin S Burt R amp Camerer C (1998) Not sodifferent after all A cross-discipline view of trust Academy ofManagement Review 23 393ndash404

Sato Y (1999) Trust and communication Sociological Theoryand Methods 13 155ndash168

Schindler P L amp Thomas C C (1993) The structure of inter-personal trust in the workplace Psychological Reports 73563ndash573

Schwarz N (2000) Emotion cognition and decision makingCognition and Emotion 14 433ndash440

Shaw R amp Kinsella-Shaw J (1988) Ecological mechanics Aphysical geometry for intentional constraints Human Move-ment Science 7 155ndash200

Sheridan T B (1975) Considerations in modeling the humansupervisory controller In Proceedings of the IFAC 6th WorldCongress (pp 1-6) Laxenburg Austria International Federationof Automatic Control

Sheridan T B (1992) Telerobotics automation and humansupervisory control Cambridge MA MIT Press

Sheridan T B amp Hennessy R T (1984) Research and modelingof supervisory control behavior Washington DC NationalAcademy

Simon H (1957) Models of man New York WileySimon H (1967) Motivational and emotional controls of cogni-

tion Psychological Review 74 29ndash39Singh I L Molloy R amp Parasuraman R (1993) Individual dif-

ferences in monitoring failures of automation Journal ofGeneral Psychology 120 357ndash373

Sitkin S B amp Roth N L (1993) Explaining the limited effec-tiveness of legalistic ldquoremediesrdquo for trustdistrust OrganizationScience 4 367ndash392

Sloman S A (1996) The empirical case for two systems of rea-soning Psychological Bulletin 119 3ndash22

Slovic P Finucane M L Peters E amp McGregor D G (inpress) Risk as analysis and risk as feelings Some thoughtsabout affect reason risk and rationality Risk Analysis

Sorkin R D amp Woods D D (1985) Systems with human moni-tors A signal detection analysis Human-Computer Interaction1 49ndash75

Sparaco P (1995 January 30) Airbus seeks to keep pilot newtechnology in harmony Aviation Week and Space Technologypp 62ndash63

Stack L (1978) Trust In H London amp J E Exner Jr (Eds)Dimensions of personality (pp 561ndash599) New York Wiley

Tan H H amp Tan C S (2000) Toward the differentiation of trustin supervisor and trust in organization Genetic Social andGeneral Psychological Monographs 126 241ndash260

Tenney Y J Rogers W H amp Pew R W (1998) Pilot opinionson cockpit automation issues International Journal of AviationPsychology 8 103ndash120

Thackray R I amp Touchstone R M (1989) Detection efficiencyon an air traffic control monitoring task with and without com-puter aiding Aviation Space and Environmental Medicine60 744ndash748

Thelen E Schoner G Scheier C amp Smith L B (2001) Thedynamics of embodiment A field theory of infant perservativereaching Behavior and Brain Sciences 24 1ndash86

Trentesaux D Moray N amp Tahon C (1998) Integration of thehuman operator into responsive discrete production manage-ment systems European Journal of Operational Research 109342ndash361

Tseng S amp Fogg B J (1999) Credibility and computing technol-ogy Communications of the ACM 42(5) 39ndash44

van Gelder T amp Port R E (1995) Itrsquos about time An overviewof the dynamical approach to cognition In R F Port amp T vanGelder (Eds) Mind as motion (pp 1ndash43) Cambridge MAMIT Press

Van House N A Butler M H amp Schiff L R (1998) Cooperatingknowledge work and trust Sharing environmental planningdata sets In Proceedings of the ACM Conference on ComputerSupported Cooperative Work (pp 335ndash343) New York Asso-ciation for Computing Machinery

Vicente K J (1999) Cognitive work analysis Towards safe pro-ductive and healthy computer-based work Mahwah NJErlbaum

Vicente K J (2002) Ecological interface design Progress andchallenges Human Factors 44 62ndash78

Vicente K J amp Rasmussen J (1992) Ecological interface designTheoretical foundations IEEE Transactions on Systems Manand Cybernetics SMC-22 589ndash606

Wickens C D Gempler K amp Morphew M E (2000) Workloadand reliability of predictor displays in aircraft traffic avoidanceTransportation Human Factors 2 99ndash126

Wicks A C Berman S L amp Jones T M (1999) The structureof optimal trust Moral and strategic Academy of ManagementReview 24 99ndash116

Williamson O (1993) Calculativeness trust and economic orga-nization Journal of Law and Economics 36 453ndash486

Wright G Rowe G Bolger F amp Gammack J (1994) Co-herence calibration and expertise in judgmental probabilityforecasting Organizational Behavior and Human DecisionProcesses 57 1ndash25

Yamagishi T amp Yamagishi M (1994) Trust and commitment inthe United States and Japan Motivation and Emotion 18129ndash166

Yeh M amp Wickens C D (2001) Display signaling in augmentedreality Effects of cue reliability and image realism on attentionallocation and trust calibration Human Factors 43 355ndash365

Zheng J Bos N Olson J S amp Olson G M (2001) Trust with-out touch Jump-start trust with social chat In CHI 2001Conference on Human Factors in Computing SystemsExtended Abstracts (pp 293ndash294) New York Association forComputing Machinery

Zuboff S (1988) In the age of smart machines The future ofwork technology and power New York Basic Books

John D Lee is an associate professor of industrialengineering and director of the Cognitive SystemsLaboratory at the University of Iowa He receivedhis PhD in mechanical engineering in 1992 fromthe University of Illinois at Urbana-Champaign

Katrina A See is a human-systems analyst withAptima in Woburn Massachusetts She received anMS in industrial engineering with a focus on trustin automation from the University of Iowa in 2002

Date received August 5 2002Date accepted September 4 2003

Page 26: Trust in Automation: Designing for Appropriate Reliancecsl/publications/pdf/leesee04.pdfcombination of many modes, and reliance is of-tena more graded process. Automation reliance

TRUST IN AUTOMATION 75

trustable. The trade-off between trustworthy and trustable automation, and the degree to which interface design and training can mitigate this trade-off, merit investigation.

Relate Context to Capability of the Automation

Appropriate trust depends on the operator's understanding of how the context affects the capability of the automation. Specific design, evaluation, and training considerations include the following:

• Reveal the context and support the assessment and classification of the situations relative to the capability of the automation.

• Show how past performance depends on situations.

• Consider how the context influences the relevance of trust; the final authority for some time-critical decisions should be allocated to automation.

• Evaluate the appropriateness of trust according to calibration, resolution, and specificity. Specificity and resolution are particularly critical when the performance of the automation is highly context dependent (see the sketch following this list).

• Training to promote appropriate trust should address how situations interact with the characteristics of the automation to affect its capability.
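As a purely illustrative aside, not drawn from the original article, the distinction between calibration and resolution can be made concrete with a small numerical sketch. In the Python fragment below, the capability and trust values are invented, and the two summary statistics are only one simple way to operationalize the concepts: the mean absolute difference between trust and capability for calibration, and the gap in mean trust between high- and low-capability situations for resolution.

    import numpy as np

    # Hypothetical data: automation capability (actual reliability, 0-1)
    # and the operator's trust rating (rescaled to 0-1) in six situations.
    capability = np.array([0.95, 0.90, 0.60, 0.55, 0.92, 0.58])
    trust = np.array([0.90, 0.85, 0.80, 0.75, 0.88, 0.70])

    # Calibration: how closely trust matches capability on average;
    # zero would indicate perfectly calibrated trust.
    calibration_error = np.mean(np.abs(trust - capability))

    # Resolution: how well trust discriminates capable from less capable
    # situations, summarized here as the difference in mean trust between
    # high- and low-capability situations.
    high = capability >= np.median(capability)
    resolution = trust[high].mean() - trust[~high].mean()

    print(f"calibration error: {calibration_error:.2f}")  # 0.11
    print(f"resolution:        {resolution:.2f}")         # 0.13

In this toy example trust stays high even where capability drops, so the calibration error is sizable and the resolution modest; that mismatch is exactly what the preceding bullet flags as most consequential when automation performance is strongly context dependent.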

Little research has addressed the challenges of promoting appropriate trust in the face of a dynamic context that influences the automation's capability. The concept of ecological interface design has led to a substantial research effort focused on revealing the physical and intentional constraints of the system being controlled (Rasmussen & Vicente, 1989; Vicente, 2002; Vicente & Rasmussen, 1992). This approach could be viewed as conveying the context to the operator, but no research has considered how to relate this representation to one that describes the capabilities of the automation. Several researchers have acknowledged the importance of context in developing appropriate trust (Cohen et al., 1999; Masalonis, 2000), but few have considered the challenge of integrating ecological interface design with interface design for automation (Furukawa & Inagaki, 1999, 2001). Important issues regarding this integration include the links among the three levels of attributional abstraction (purpose, process, and performance) and the dimension of detail, which describe the characteristics of the automation, and the abstraction hierarchy. Another important consideration is the link between the interface implications of skill-, rule-, and knowledge-based performance and those of the analytic, analogical, and affective processes that govern trust.

Consider Cultural, Organizational, and Team Interactions

Trust and reliance depend on organizational and cultural factors that go beyond the individual operator working with automation. Extrapolation from the relatively little research on human-automation interaction and from the considerable research base on organizational and interpersonal trust suggests the following design, evaluation, and training considerations:

• Consider the organizational context and the indirect exposure to automation that it facilitates.

• Consider individual and cultural differences in evaluations, because they may influence reliance in ways that are not directly related to the characteristics of the automation.

• Cultural differences regarding expectations of the automation can be a powerful force that can lead to misuse and disuse unless addressed by appropriate training.

• Gossip and informal discussion of automation capabilities need to be considered in training and retraining so that operators' understanding of the automation reflects its true capabilities.

Very little research has considered how individual and cultural differences influence trust and reliance. The few studies that have considered these issues suggest that individual and cultural differences can influence human-automation interactions in unexpected ways and merit further investigation, particularly in the context of extrapolating experimental data. Another important consideration is the role of team and organizational structure in the diffusion of trust among coworkers. Understanding how communication networks combine with the frequency of direct interaction with the automation and its reliability could have important implications for supporting appropriate trust. Research has not yet considered the evolution of trust in multiperson groups that share responsibility for managing automation. In this situation, people must develop appropriate trust not only in the automation but also in the other people who might assume manual control. Even more challenging is the development of appropriate meta trust in the other people. Meta trust refers to the trust a person has that the other person's trust in the automation is appropriate. Multiperson, computer-mediated management of automation represents an unexplored area with considerable implications for many emerging systems.

CONCLUSION

"Emotion is critical for the appropriate direction of attention since it provides an automated signal about the organism's past experience with given objects and thus provides a basis for assigning or withholding attention relative to a given object" (Damasio, 1999, p. 273).

Trust influences reliance on automation; however, it does not determine reliance. As with other psychological constructs that explain behavior, it is important to identify the boundary conditions within which trust makes a difference. For trust, the boundary conditions include situations in which uncertainty and complexity make an exhaustive evaluation of options impractical. Trust is likely to influence reliance on complex, imperfect automation in dynamic environments that require the person to adapt to unanticipated circumstances. More generally, trust seems to be an example of how affect can guide behavior when rules fail to apply and when cognitive resources are not available to support a calculated, rational choice (Barley, 1988). The conceptual model developed in this paper identifies some of the factors that interact with trust to influence reliance. By helping to define boundary conditions regarding the influence of trust, this model may lead to more precise hypotheses, more revealing measures of trust, and design approaches that promote more appropriate trust.

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization; as Neisser stated, "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence the design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect – An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B – Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B – Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology – Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy – A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces – Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology – Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.


MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished Ph.D. dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology – Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished Ph.D. dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished Ph.D. dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perservative reaching. Behavior and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his Ph.D. in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an M.S. in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003

Page 27: Trust in Automation: Designing for Appropriate Reliancecsl/publications/pdf/leesee04.pdfcombination of many modes, and reliance is of-tena more graded process. Automation reliance

76 Spring 2004 ndash Human Factors

of appropriate meta trust in the other peopleMeta trust refers to the trust a person has thatthe other personrsquos trust in the automation isappropriate Multiperson computer-mediatedmanagement of automation represents an unex-plored area with considerable implications formany emerging systems

CONCLUSION

ldquoEmotion is critical for the appropriate direc-tion of attention since it provides an automatedsignal about the organismrsquos past experiencewith given objects and thus provides a basis forassigning or withholding attention relative to agiven objectrdquo (Damasio 1999 p 273)

Trust influences reliance on automationhowever it does not determine reliance As withother psychological constructs that explain be-havior it is important to identify the boundaryconditions within which trust makes a differ-ence For trust the boundary conditions includesituations in which uncertainty and complexitymake an exhaustive evaluation of options im-practical Trust is likely to influence reliance oncomplex imperfect automation in dynamic envi-ronments that require the person to adapt tounanticipated circumstances More generallytrust seems to be an example of how affect canguide behavior when rules fail to apply andwhen cognitive resources are not available tosupport a calculated rational choice (Barley1988) The conceptual model developed in thispaper identifies some of the factors that interactwith trust to influence reliance By helping todefine boundary conditions regarding the influ-ence of trust this model may lead to more pre-cise hypotheses more revealing measures oftrust and design approaches that promote moreappropriate trust

Trust is one example of the important influence of affect and emotions on human-technology interaction. Emotional response to technology is not only important for acceptance; it can also make a fundamental contribution to safety and performance. This is not a new realization, as Neisser stated: "human thinking begins in an intimate association with emotions and feelings which is never entirely lost" (as quoted in Simon, 1967, p. 29). However, little systematic research has addressed how these considerations should influence the design and modeling of human-technology interaction. Trust is one example of how affect-related considerations should influence the design of complex, high-consequence systems.

Because automation and computer technology are growing increasingly complex, the importance of affect and trust is likely to grow. Computer technology allows people to develop relationships and collaborate with little or no direct contact, but these new opportunities produce complex situations that require appropriate trust. As computer technology grows more pervasive, trust is also likely to become a critical factor in consumer products, such as home automation, personal robots, and automotive automation. Designing trustable technology may be a critical factor in the success of the next generation of automation and computer technology.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation under Grant No. 0117494. This manuscript was greatly improved by the comments on earlier versions from Amy Bisantz, John Hajdukiewicz, Jenna Hetland, Greg Jamieson, Robin Mitchell, Raja Parasuraman, Chris Miller, Michelle Reyes, Kim Vicente, Christopher Wickens, and an anonymous reviewer. Special thanks to Neville Moray for his inspiration and insight.

REFERENCES

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Upper Saddle River, NJ: Prentice Hall.

Baba, M. L. (1999). Dangerous liaisons: Trust, distrust, and information technology in American work organizations. Human Organization, 58, 331–346.

Baba, M. L., Falkenburg, D. R., & Hill, D. H. (1996). Technology management and American culture: Implications for business process redesign. Research Technology Management, 39(6), 44–54.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Barber, B. (1983). The logic and limits of trust. New Brunswick, NJ: Rutgers University Press.

Barley, S. R. (1988). The social construction of a machine: Ritual, superstition, magical thinking and other pragmatic responses to running a CT scanner. In M. Lock & D. R. Gordon (Eds.), Biomedicine examined (pp. 497–539). Dordrecht, Netherlands: Kluwer Academic.

Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50, 7–15.

Bechara, A., Damasio, H., Tranel, D., & Anderson, S. W. (1998). Dissociation of working memory from decision making within the human prefrontal cortex. Journal of Neuroscience, 18, 428–437.

Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275(5304), 1293–1295.

Beer, R. D. (2001). Dynamical approaches in cognitive science. Trends in Cognitive Sciences, 4, 91–99.

Berscheid, E. (1994). Interpersonal relationships. Annual Review of Psychology, 45, 79–129.

Bhattacharya, R., Devinney, T. M., & Pillutla, M. M. (1998). A formal model of trust based on outcomes. Academy of Management Review, 23, 459–472.

Bisantz, A. M., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 28(2), 85–97.

Bliss, J., Dunn, M., & Fuller, B. S. (1995). Reversal of the cry-wolf effect: An investigation of two methods to increase alarm response rates. Perceptual and Motor Skills, 80, 1231–1242.

Brehmer, B., & Dorner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171–184.

Brewster, S. A. (1997). Using non-speech sound to overcome information overload. Displays, 17, 179–189.

Briggs, P., Burford, B., & Dracup, C. (1998). Modeling self-confidence in users of a computer-based system showing unrepresentative design. International Journal of Human-Computer Studies, 49, 717–742.

Burt, R. S., & Knez, M. (1996). Trust and third-party gossip. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 68–89). Thousand Oaks, CA: Sage.

Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.

Butler, J. K., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling trust in superiors and subordinates. Psychological Reports, 55, 19–28.

Case, K., Sinclair, M. A., & Rani, M. R. A. (1999). An experimental investigation of human mismatches in machining. Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture, 213, 197–201.

Cassell, J., & Bickmore, T. (2000). External manifestations of trustworthiness in the interface. Communications of the ACM, 43(12), 50–56.

Castelfranchi, C. (1998). Modelling social action for AI agents. Artificial Intelligence, 103, 157–182.

Castelfranchi, C., & Falcone, R. (2000). Trust and control: A dialectic link. Applied Artificial Intelligence, 14, 799–823.

Cohen, M. S. (2000). Training for trust in decision aids, with applications to the rotorcraft's pilot's associate. Presented at the International Conference on Human Performance, Situation Awareness and Automation, Savannah, GA. Abstract retrieved March 8, 2004, from http://www.cog-tech.com/papers/Trust/Adaptive%20Automation%20abstract%20final.pdf

Cohen, M. S., Parasuraman, R., & Freeman, J. (1999). Trust in decision aids: A model and its training implications (Tech. Report USAATCOM TR 97-D-4). Arlington, VA: Cognitive Technologies.

Conejo, R. A., & Wickens, C. D. (1997). The effects of highlighting validity and feature type on air-to-ground target acquisition performance (ARL-97-11/NAWC-ONR-97-1). Savoy, IL: Aviation Research Laboratory, Institute for Aviation.

Cook, J., & Wall, T. (1980). New work attitude measures of trust, organizational commitment and personal need non-fulfillment. Journal of Occupational Psychology, 53, 39–52.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001a, March–April). Trust in the online environment. Workshop presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, WA.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2001b). Trust in the online environment. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1548–1552). Mahwah, NJ: Erlbaum.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (2003a). On-line trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies, 58, 737–758.

Corritore, C. L., Kracher, B., & Wiedenbeck, S. (Eds.). (2003b). Trust and technology [Special issue]. International Journal of Human-Computer Studies, 58(6).

Couch, L. L., & Jones, W. H. (1997). Measuring levels of trust. Journal of Research in Personality, 31(3), 319–336.

Coutu, D. L. (1998). Trust in virtual teams. Harvard Business Review, 76, 20–21.

Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA: Human Factors and Ergonomics Society.

Crossman, E. R. F. W. (1974). Automation and skill. In E. Edwards & F. P. Lees (Eds.), The human operator in process control (pp. 1–24). London: Taylor & Francis.

Damasio, A. R. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 351, 1413–1420.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness. New York: Harcourt.

Damasio, A. R. (2003). Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Harcourt.

Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research, 41, 81–94.

Dassonville, I., Jolly, D., & Desodt, A. M. (1996). Trust between man and machine in a teleoperation system. Reliability Engineering and System Safety, 53, 319–325.

Deutsch, M. (1958). Trust and suspicion. Journal of Conflict Resolution, 2, 265–279.

Deutsch, M. (1960). The effect of motivational orientation upon trust and suspicion. Human Relations, 13, 123–139.

Doney, P. M., Cannon, J. P., & Mullen, M. R. (1998). Understanding the influence of national culture on the development of trust. Academy of Management Review, 23, 601–620.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., & Dawe, L. A. (2002). The perceived utility of human and automated aids in a visual detection task. Human Factors, 44, 79–94.

Dzindolet, M. T., Pierce, L. G., Beck, H. P., Dawe, L. A., & Anderson, B. W. (2001). Predicting misuse and disuse of combat identification systems. Military Psychology, 13, 147–164.

Entin, E. B., Entin, E. E., & Serfaty, D. (1996). Optimizing aided target-recognition performance. In Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting (pp. 233–237). Santa Monica, CA: Human Factors and Ergonomics Society.

Farrell, S., & Lewandowsky, S. (2000). A connectionist model of complacency and adaptive recovery under automation. Journal of Experimental Psychology: Learning, Memory, and Cognition, 26, 395–410.

Fine, G. A., & Holyfield, L. (1996). Secrecy, trust, and dangerous leisure: Generating group cohesion in voluntary organizations. Social Psychology Quarterly, 59, 22–38.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior. Reading, MA: Addison-Wesley.

Fogg, B., Marshall, J., Kameda, T., Solomon, J., Rangnekar, A., Boyd, J., et al. (2001). Web credibility research: A method for online experiments and early study results. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 293–294). New York: Association for Computing Machinery.

Fogg, B., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What makes Web sites credible? A report on a large quantitative study. In CHI 2001 Conference on Human Factors in Computing Systems (pp. 61–68). New York: Association for Computing Machinery.

Fox, J. M. (1996). Effects of information accuracy on user trust and compliance. In CHI 1996 Conference on Human Factors in Computing Systems (pp. 35–36). New York: Association for Computing Machinery.

Fox, J. M., & Boehm-Davis, D. A. (1998). Effects of age and congestion information accuracy of advanced traveler information on user trust and compliance. Transportation Research Record, 1621, 43–49.

Furukawa, H., & Inagaki, T. (1999). Situation-adaptive interface based on abstraction hierarchies with an updating mechanism for maintaining situation awareness of plant operators. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (pp. 693–698). Piscataway, NJ: IEEE.

Furukawa, H., & Inagaki, T. (2001). A graphical interface design for supporting human-machine cooperation: Representing automation's intentions by means-ends relations. In Proceedings of the 8th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design and Evaluation of Human-Machine Systems (pp. 573–578). Laxenburg, Austria: International Federation of Automatic Control.

Gabarro, J. J. (1978). The development of trust, influence, and expectations. In A. G. Athos & J. J. Gabarro (Eds.), Interpersonal behavior: Communication and understanding in relationships (pp. 290–230). Englewood Cliffs, NJ: Prentice Hall.

Gaines, S. O., Panter, A. T., Lyde, M. D., Steers, W. N., Rusbult, C. E., Cox, C. L., et al. (1997). Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 73, 610–623.

Gist, M. E., & Mitchell, T. R. (1992). Self-efficacy: A theoretical analysis of its determinants and malleability. Academy of Management Review, 17, 183–211.

Gong, L., Nass, C., Simard, C., & Takhteyev, Y. (2001). When non-human is better than semi-human: Consistency in speech interfaces. In M. J. Smith, G. Salvendy, D. Harris, & R. Koubek (Eds.), Usability evaluation and interface design: Cognitive engineering, intelligent agents and virtual reality (pp. 1558–1562). Mahwah, NJ: Erlbaum.

Grice, H. P. (Ed.). (1975). Logic and conversation: Vol. 3. Speech acts. New York: Academic.

Gurtman, M. B. (1992). Trust, distrust, and interpersonal problems: A circumplex analysis. Journal of Personality and Social Psychology, 62, 989–1002.

Halprin, S., Johnson, E., & Thornburry, J. (1973). Cognitive reliability in manned systems. IEEE Transactions on Reliability, R-22, 165–169.

Hardin, R. (1992). The street-level epistemology of trust. Politics and Society, 21, 505–529.

Hoc, J. M. (2000). From human-machine interaction to human-machine cooperation. Ergonomics, 43, 833–843.

Holmes, J. G. (1991). Trust and the appraisal process in close relationships. In W. H. Jones & D. Perlman (Eds.), Advances in personal relationships (Vol. 2, pp. 57–104). London: Jessica Kingsley.

Hovland, C. I., Janis, I. L., & Kelly, H. H. (1953). Communication and persuasion: Psychological studies of opinion change. New Haven, CT: Yale University Press.

Huang, H., Keser, C., Leland, J., & Shachat, J. (2002). Trust, the Internet, and the digital divide (RC22511). Yorktown Heights, NY: IBM Research Division.

Inagaki, T., Takae, Y., & Moray, N. (1999). Decision supported information for takeoff safety in human-centered automation: An experimental investigation of time-fragile characteristics. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 1, pp. 1101–1106). Piscataway, NJ: IEEE.

Itoh, M., Abe, G., & Tanaka, K. (1999). Trust in and use of automation: Their dependence on occurrence patterns of malfunctions. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (Vol. 3, pp. 715–720). Piscataway, NJ: IEEE.

Janis, I. L., & Mann, L. (1977). Decision making: A psychological analysis of conflict, choice, and commitment. New York: Free Press.

Jennings, E. E. (1967). The mobile manager: A study of the new generation of top executives. Ann Arbor, MI: Bureau of Industrial Relations, University of Michigan.

Johns, J. L. (1996). A concept analysis of trust. Journal of Advanced Nursing, 24, 76–83.

Johnson-Laird, P. N., & Oately, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.

Jones, G. R., & George, J. M. (1998). The experience and evolution of trust: Implications for cooperation and teamwork. Academy of Management Review, 23, 531–546.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997a). Driver acceptance of unreliable traffic information in familiar and unfamiliar settings. Human Factors, 39, 164–176.

Kantowitz, B. H., Hanowski, R. J., & Kantowitz, S. C. (1997b). Driver reliability requirements for traffic advisory information. In Y. I. Noy (Ed.), Ergonomics and safety of intelligent driver interfaces (pp. 1–22). Mahwah, NJ: Erlbaum.

Karvonen, K. (2001). Designing trust for a universal audience: A multicultural study on the formation of trust in the Internet in the Nordic countries. In C. Stephanidis (Ed.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 3, pp. 1078–1082). Mahwah, NJ: Erlbaum.

Karvonen, K., & Parkkinen, J. (2001). Signs of trust: A semiotic study of trust formation in the Web. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), First International Conference on Universal Access in Human-Computer Interaction (Vol. 1, pp. 1076–1080). Mahwah, NJ: Erlbaum.

Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Conflict Resolution, 14, 357–366.

Kelso, J. A. S. (1995). Dynamic patterns: Self-organization of brain and behavior. Cambridge, MA: MIT Press.

Kikuchi, M., Wantanabe, Y., & Yamasishi, T. (1996). Judgment accuracy of other's trustworthiness and general trust: An experimental study. Japanese Journal of Experimental Social Psychology, 37, 23–36.

Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces: Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1–29.

Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35, 221–242.

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine system research (Vol. 5, pp. 47–92). Greenwich, CT: JAI.

Kramer, R. M. (1999). Trust and distrust in organizations: Emerging perspectives, enduring questions. Annual Review of Psychology, 50, 569–598.

Kramer, R. M., Brewer, M. B., & Hanna, B. A. (1996). Collective trust and collective action: The decision to trust as a social decision. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 357–389). Thousand Oaks, CA: Sage.

Kramer, R. M., & Tyler, T. R. (Eds.). (1996). Trust in organizations: Frontiers of theory and research. Thousand Oaks, CA: Sage.

Lee, J. D., Gore, B. F., & Campbell, J. L. (1999). Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transportation Human Factors, 1, 347–377.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243–1270.

Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153–184.

Lee, J. D., & Sanquist, T. F. (1996). Maritime automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 365–384). Mahwah, NJ: Erlbaum.

Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 273–285.

Lee, M. K. O., & Turban, E. (2001). A trust model for consumer Internet shopping. International Journal of Electronic Commerce, 6(1), 75–91.

Lewandowsky, S., Mundy, M., & Tan, G. (2000). The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology: Applied, 6, 104–123.

Lewicki, R. J., & Bunker, B. B. (1996). Developing and maintaining trust in work relationships. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 114–139). Thousand Oaks, CA: Sage.

Lewis, M. (1998). Designing for human-agent interaction. AI Magazine, 19(2), 67–78.

Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.

Liu, Y., Fuld, R., & Wickens, C. D. (1993). Monitoring behavior in manual and automated scheduling systems. International Journal of Man-Machine Studies, 39, 1015–1029.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.

MacMillan, J., Entin, E. B., & Serfaty, D. (1994). Operator reliance on automated support for target acquisition. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 1285–1289). Santa Monica, CA: Human Factors and Ergonomics Society.

Malhotra, D., & Murnighan, J. K. (2002). The effects of contracts on interpersonal trust. Administrative Science Quarterly, 47, 534–559.

Masalonis, A. J. (2000). Effects of situation-specific reliability on trust and usage of automated decision aids. Unpublished PhD dissertation, Catholic University of America, Washington, DC.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McKnight, D. H., Cummings, L. L., & Chervany, N. L. (1998). Initial trust formation in new organizational relationships. Academy of Management Review, 23, 473–490.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors, 43, 563–572.

Meyerson, D., Weick, K. E., & Kramer, R. M. (1996). Swift trust and temporary groups. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 166–195). Thousand Oaks, CA: Sage.

Milewski, A. E., & Lewis, S. H. (1997). Delegating to software agents. International Journal of Human-Computer Studies, 46, 485–500.

Miller, C. A. (2002). Definitions and dimensions of etiquette. In C. Miller (Ed.), Etiquette for human-computer work (Tech. Report FS-02-02, pp. 1–7). Menlo Park, CA: American Association for Artificial Intelligence.

Mishra, A. K. (1996). Organizational response to crisis. In R. M. Kramer & T. R. Tyler (Eds.), Trust in organizations: Frontiers of theory and research (pp. 261–287). Thousand Oaks, CA: Sage.

Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38, 311–322.

Montgomery, D. A., & Sorkin, R. D. (1996). Observer sensitivity to element reliability in a multielement visual display. Human Factors, 38, 484–494.

Moorman, C., Deshpande, R., & Zaltman, G. (1993). Factors affecting trust in market-research relationships. Journal of Marketing, 57(1), 81–101.

Moray, N. (2003). Monitoring, complacency, scepticism and eutactic behaviour. International Journal of Industrial Ergonomics, 31, 175–178.

Moray, N., & Inagaki, T. (1999). Laboratory studies of trust between humans and machines in automated systems. Transactions of the Institute of Measurement and Control, 21, 203–211.

Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology: Applied, 6, 44–58.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20–38.

Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47–63.

Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191–197). Hillsdale, NJ: Erlbaum.

Muir, B. M. (1987). Trust between humans and machines, and the design of decision aides. International Journal of Man-Machine Studies, 27, 527–539.

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished PhD dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication: Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology: Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report: Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm: Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A: Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it: Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.

Riley, V. A. (1994). Human use of automation. Unpublished PhD dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. F. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his PhD in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an MS in industrial engineering with a focus on trust in automation from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003

Page 28: Trust in Automation: Designing for Appropriate Reliancecsl/publications/pdf/leesee04.pdfcombination of many modes, and reliance is of-tena more graded process. Automation reliance

TRUST IN AUTOMATION 77

Bechara A Damasio H Tranel D amp Damasio A R (1997)Deciding advantageously before knowing the advantageousstrategy Science 275(5304) 1293ndash1295

Beer R D (2001) Dynamical approaches in cognitive scienceTrends in Cognitive Sciences 4 91ndash99

Berscheid E (1994) Interpersonal relationships Annual Reviewof Psychology 45 79ndash129

Bhattacharya R Devinney T M amp Pillutla M M (1998) A for-mal model of trust based on outcomes Academy of Manage-ment Review 23 459ndash472

Bisantz A M amp Seong Y (2001) Assessment of operator trustin and utilization of automated decision-aids under differentframing conditions International Journal of Industrial Ergo-nomics 28(2) 85ndash97

Bliss J Dunn M amp Fuller B S (1995) Reversal of the cry-wolfeffect ndash An investigation of two methods to increase alarmresponse rates Perceptual and Motor Skills 80 1231ndash1242

Brehmer B amp Dorner D (1993) Experiments with computer-simulated microworlds Escaping both the narrow straits of thelaboratory and the deep blue sea of the field-study Computersin Human Behavior 9 171ndash184

Brewster S A (1997) Using non-speech sound to overcome infor-mation overload Displays 17 179ndash189

Briggs P Burford B amp Dracup C (1998) Modeling self-confidence in users of a computer-based system showingunrepresentative design International Journal of Human-Computer Studies 49 717ndash742

Burt R S amp Knez M (1996) Trust and third-party gossip In RM Kramer amp T R Tyler (Eds) Trust in organizations Frontiersof theory and research (pp 68ndash89) Thousand Oaks CA Sage

Busemeyer J R amp Townsend J T (1993) Decision field theoryA dynamic cognitive approach to decision making in an uncer-tain environment Psychological Review 100 432ndash459

Butler J K amp Cantrell R S (1984) A behavioral decision theoryapproach to modeling trust in superiors and subordinatesPsychological Reports 55 19ndash28

Case K Sinclair M A amp Rani M R A (1999) An experimentalinvestigation of human mismatches in machining Proceedingsof the Institution of Mechanical Engineers Part B ndash Journal ofEngineering Manufacture 213 197ndash201

Cassell J amp Bickmore T (2000) External manifestations of trust-worthiness in the interface Communications of the ACM43(12) 50ndash56

Castelfranchi C (1998) Modelling social action for AI agentsArtificial Intelligence 103 157ndash182

Castelfranchi C amp Falcone R (2000) Trust and control Adialectic link Applied Artificial Intelligence 14 799ndash823

Cohen M S (2000) Training for trust in decision aids with appli-cations to the rotorcraftrsquos pilotrsquos associate Presented at theInternational Conference on Human Performance SituationAwareness and Automation Savannah GA Abstract retrievedMarch 8 2004 from httpwwwcog-techcompapersTrustAdaptive20Automation20abstract20finalpdf

Cohen M S Parasuraman R amp Freeman J (1999) Trust indecision aids A model and its training implications (TechReport USAATCOM TR 97-D-4) Arlington VA CognitiveTechnologies

Conejo R A amp Wickens C D (1997) The effects of highlightingvalidity and feature type on air-to-ground target acquisitionperformance (ARL-97-11NAWC-ONR-97-1) Savoy ILAviation Research Laboratory Institute for Aviation

Cook J amp Wall T (1980) New work attitude measures of trustorganizational commitment and personal need non-fulfillmentJournal of Occupational Psychology 53 39ndash52

Corritore C L Kracher B amp Wiedenbeck S (2001a MarchndashApril) Trust in the online environment Workshop presentedat the CHI 2001 Conference on Human Factors in ComputingSystems Seattle WA

Corritore C L Kracher B amp Wiedenbeck S (2001b) Trust inthe online environment In M J Smith G Salvendy D Harrisamp R J Koubek (Eds) Usability evaluation and interfacedesign Cognitive engineering intelligent agents and virtualreality (pp 1548-1552) Mahwah NJ Erlbaum

Corritore C L Kracher B amp Wiedenbeck S (2003a) On-linetrust Concepts evolving themes a model International Journalof Human-Computer Studies 58 737ndash758

Corritore C L Kracher B amp Wiedenbeck S (Eds) (2003b)Trust and technology [Special issue] International Journal ofHuman-Computer Studies 58(6)

Couch L L amp Jones W H (1997) Measuring levels of trustJournal of Research in Personality 31(3) 319ndash336

Coutu D L (1998) Trust in virtual teams Harvard BusinessReview 76 20ndash21

Crocoll W M amp Coury B G (1990) Status or recommendationSelecting the type of information for decision aiding InProceedings of the Human Factors Society 34th AnnualMeeting (pp 1524ndash1528) Santa Monica CA Human Factorsand Ergonomics Society

Crossman E R F W (1974) Automation and skill In E Edwardsamp F P Lees (Eds) The human operator in process control(pp 1ndash24) London Taylor amp Francis

Damasio A R (1996) The somatic marker hypothesis and thepossible functions of the prefrontal cortex PhilosophicalTransactions of the Royal Society of London Series B ndash Bio-logical Sciences 351 1413ndash1420

Damasio A R (1999) The feeling of what happens Body andemotion in the making of consciousness New York Harcourt

Damasio A R (2003) Looking for Spinoza Joy sorrow and thefeeling brain New York Harcourt

Damasio A R Tranel D amp Damasio H (1990) Individualswith sociopathic behavior caused by frontal damage fail torespond autonomically to social-stimuli Behavioural BrainResearch 41 81ndash94

Dassonville I Jolly D amp Desodt A M (1996) Trust betweenman and machine in a teleoperation system Reliability Engi-neering and System Safety 53 319ndash325

Deutsch M (1958) Trust and suspicion Journal of ConflictResolution 2 265ndash279

Deutsch M (1960) The effect of motivational orientation upontrust and suspicion Human Relations 13 123ndash139

Doney P M Cannon J P amp Mullen M R (1998) Understandingthe influence of national culture on the development of trustAcademy of Management Review 23 601ndash620

Dzindolet M T Pierce L G Beck H P amp Dawe L A (2002)The perceived utility of human and automated aids in a visualdetection task Human Factors 44 79ndash94

Dzindolet M T Pierce L G Beck H P Dawe L A ampAnderson B W (2001) Predicting misuse and disuse of com-bat identification systems Military Psychology 13 147ndash164

Entin E B Entin E E amp Serfaty D (1996) Optimizing aidedtarget-recognition performance In Proceedings of the HumanFactors and Ergonomics Society 40th Annual Meeting (pp233ndash237) Santa Monica CA Human Factors and ErgonomicsSociety

Farrell S amp Lewandowsky S (2000) A connectionist model ofcomplacency and adaptive recovery under automation Journalof Experimental Psychology ndash Learning Memory and Cognition26 395ndash410

Fine G A amp Holyfield L (1996) Secrecy trust and dangerousleisure Generating group cohesion in voluntary organizationsSocial Psychology Quarterly 59 22ndash38

Finucane M L Alhakami A Slovic P amp Johnson S M (2000)The affect heuristic in judgments of risks and benefits Journalof Behavioral Decision Making 13 1ndash17

Fishbein M amp Ajzen I (1975) Belief attitude intention andbehavior Reading MA Addison-Wesley

Fogg B Marshall J Kameda T Solomon J Rangnekar ABoyd J et al (2001) Web credibility research A method foronline experiments and early study results In CHI 2001Conference on Human Factors in Computing Systems (pp293ndash294) New York Association for Computing Machinery

Fogg B Marshall J Laraki O Osipovich A Varma C FangN et al (2001) What makes Web sites credible A report ona large quantitative study In CHI 2001 Conference on HumanFactors in Computing Systems (pp 61ndash68) New York Asso-ciation for Computing Machinery

Fox J M (1996) Effects of information accuracy on user trustand compliance In CHI 1996 Conference on Human Factorsin Computing Systems (pp 35-36) New York Association forComputing Machinery

Fox J M amp Boehm-Davis D A (1998) Effects of age and con-gestion information accuracy of advanced traveler informationon user trust and compliance Transportation Research Record1621 43ndash49

Furukawa H amp Inagaki T (1999) Situation-adaptive interfacebased on abstraction hierarchies with an updating mechanismfor maintaining situation awareness of plant operators InProceedings of the 1999 IEEE International Conference on

78 Spring 2004 ndash Human Factors

Systems Man and Cybernetics (pp 693ndash698) Piscataway NJ IEEE

Furukawa H amp Inagaki T (2001) A graphical interface designfor supporting human-machine cooperation Representingautomationrsquos intentions by means-ends relations In Proceedingsof the 8th IFACIFIPIFORSIEA Symposium on AnalysisDesign and Evaluation of Human-Machine Systems (pp573ndash578) Laxenburg Austria International Federation ofAutomatic Control

Gabarro J J (1978) The development of trust influence and ex-pectations In A G Athos amp J J Gabarro (Eds) Interpersonalbehavior Communication and understanding in relationships(pp 290ndash230) Englewood Cliffs NJ Prentice Hall

Gaines S O Panter A T Lyde M D Steers W N Rusbult CE Cox C L et al (1997) Evaluating the circumplexity ofinterpersonal traits and the manifestation of interpersonal traitsin interpersonal trust Journal of Personality and SocialPsychology 73 610ndash623

Gist M E amp Mitchell T R (1992) Self-efficacy ndash A theoreticalanalysis of its determinants and malleability Academy ofManagement Review 17 183ndash211

Gong L Nass C Simard C amp Takhteyev Y (2001) Whennon-human is better than semi-human Consistency in speechinterfaces In M J Smith G Salvendy D Harris amp R Koubek(Eds) Usability evaluation and interface design Cognitiveengineering intelligent agents and virtual reality (pp 1558ndash1562) Mahwah NJ Erlbaum

Grice H P (Ed) (1975) Logic and conversation Vol 3 Speechacts New York Academic

Gurtman M B (1992) Trust distrust and interpersonal prob-lems A circumplex analysis Journal of Personality and SocialPsychology 62 989ndash1002

Halprin S Johnson E amp Thornburry J (1973) Cognitive relia-bility in manned systems IEEE Transactions on Reliability R-22 165-169

Hardin R (1992) The street-level epistemology of trust Politicsand Society 21 505ndash529

Hoc J M (2000) From human-machine interaction to human-machine cooperation Ergonomics 43 833ndash843

Holmes J G (1991) Trust and the appraisal process in close rela-tionships In W H Jones amp D Perlman (Eds) Advances inpersonal relationships (Vol 2 pp 57-104) London JessicaKingsley

Hovland C I Janis I L amp Kelly H H (1953) Communicationand persuasion Psychological studies of opinion change NewHaven CT Yale University Press

Huang H Keser C Leland J amp Shachat J (2002) Trust theInternet and the digital divide (RC22511) Yorktown HeightsNY IBM Research Division

Inagaki T Takae Y amp Moray N (1999) Decision supported in-formation for takeoff safety in human-centered automation Anexperimental investigation of time-fragile characteristics InProceedings of the IEEE International Conference on SystemsMan and Cybernetics (Vol 1 pp 1101ndash1106) Piscataway NJIEEE

Itoh M Abe G amp Tanaka K (1999) Trust in and use of auto-mation Their dependence on occurrence patterns of malfunc-tions In Proceedings of the IEEE International Conference onSystems Man and Cybernetics (Vol 3 pp 715ndash720) Piscata-way NJ IEEE

Janis I L amp Mann L (1977) Decision making A psychologicalanalysis of conflict choice and commitment New York Free

Jennings E E (1967) The mobile manager A study of the newgeneration of top executives Ann Arbor MI Bureau of In-dustrial Relations University of Michigan

Johns J L (1996) A concept analysis of trust Journal of Ad-vanced Nursing 24 76ndash83

Johnson-Laird P N amp Oately K (1992) Basic emotions ratio-nality and folk theory Cognition and Emotion 6 201ndash223

Jones G R amp George J M (1998) The experience and evolutionof trust Implications for cooperation and teamwork Academyof Management Review 23 531ndash546

Kantowitz B H Hanowski R J amp Kantowitz S C (1997a)Driver acceptance of unreliable traffic information in familiarand unfamiliar settings Human Factors 39 164ndash176

Kantowitz B H Hanowski R J amp Kantowitz S C (1997b)Driver reliability requirements for traffic advisory informationIn Y I Noy (Ed) Ergonomics and safety of intelligent driverinterfaces (pp 1ndash22) Mahwah NJ Erlbaum

Karvonen K (2001) Designing trust for a universal audience Amulticultural study on the formation of trust in the internet inthe Nordic countries In C Stephanidis (Ed) First InternationalConference on Universal Access in Human-Computer Inter-action (Vol 3 pp 1078ndash1082) Mahwah NJ Erlbaum

Karvonen K amp Parkkinen J (2001) Signs of trust A semioticstudy of trust formation in the Web In M J Smith GSalvendy D Harris amp R J Koubek (Eds) First InternationalConference on Universal Access in Human-Computer Inter-action (Vol 1 pp 1076ndash1080) Mahwah NJ Erlbaum

Kee H W amp Knox R E (1970) Conceptual and methodologicalconsiderations in the study of trust and suspicion ConflictResolution 14 357ndash366

Kelso J A S (1995) Dynamic patterns Self-organization of brainand behavior Cambridge MA MIT Press

Kikuchi M Wantanabe Y amp Yamasishi T (1996) Judgmentaccuracy of otherrsquos trustworthiness and general trust An exper-imental study Japanese Journal of Experimental SocialPsychology 37 23ndash36

Kim J amp Moon J Y (1998) Designing towards emotional usabil-ity in customer interfaces ndash Trustworthiness of cyber-bankingsystem interfaces Interacting With Computers 10 1ndash29

Kirlik A (1993) Modeling strategic behavior in human-automa-tion interaction Why an ldquoaidrdquo can (and should) go unusedHuman Factors 35 221ndash242

Klein G A (1989) Recognition-primed decisions In W B Rouse(Ed) Advances in man-machine system research (Vol 5 pp47ndash92) Greenwich CT JAI

Kramer R M (1999) Trust and distrust in organizations Emerg-ing perspectives enduring questions Annual Review of Psy-chology 50 569ndash598

Kramer R M Brewer M B amp Hanna B A (1996) Collectivetrust and collective action The decision to trust as a socialdecision In R M Kramer amp T R Tyler (Eds) Trust in organi-zations Frontiers of theory and research (pp 357ndash389)Thousand Oaks CA Sage

Kramer R M amp Tyler T R (Eds) (1996) Trust in organizationsFrontiers of theory and research Thousand Oaks CA Sage

Lee J D Gore B F amp Campbell J L (1999) Display alterna-tives for in-vehicle warning and sign information Messagestyle location and modality Transportation Human Factors 1347ndash377

Lee J D amp Moray N (1992) Trust control strategies and alloca-tion of function in human-machine systems Ergonomics 351243ndash1270

Lee J D amp Moray N (1994) Trust self-confidence and opera-torsrsquo adaptation to automation International Journal ofHuman-Computer Studies 40 153ndash184

Lee J D amp Sanquist T F (1996) Maritime automation In RParasuraman amp M Mouloua (Eds) Automation and humanperformance (pp 365ndash384) Mahwah NJ Erlbaum

Lee J D amp Sanquist T F (2000) Augmenting the operator func-tion model with cognitive operations Assessing the cognitivedemands of technological innovation in ship navigation IEEETransactions on Systems Man and Cybernetics ndash Part ASystems and Humans 30 273ndash285

Lee M K O amp Turban E (2001) A trust model for consumerInternet shopping International Journal of Electronic Com-merce 6(1) 75ndash91

Lewandowsky S Mundy M amp Tan G (2000) The dynamics oftrust Comparing humans to automation Journal of Experi-mental Psychology ndash Applied 6 104ndash123

Lewicki R J amp Bunker B B (1996) Developing and maintain-ing trust in work relationships In R M Kramer amp T R Tyler(Eds) Trust in organizations Frontiers of theory and research(pp 114-139) Thousand Oaks CA Sage

Lewis M (1998) Designing for human-agent interaction AIMagazine 19(2) 67ndash78

Lichtenstein S Fischhoff B amp Phillips L D (1982) Calibrationof probabilities The state of the art to 1980 In D KahnemanP Slovic amp A Tversky (Eds) Judgment under uncertaintyHeuristics and biases (pp 306ndash334) New York CambridgeUniversity Press

Liu Y Fuld R amp Wickens C D (1993) Monitoring behavior inmanual and automated scheduling systems InternationalJournal of Man-Machine Studies 39 1015ndash1029

Loewenstein G F Weber E U Hsee C K amp Welch N (2001)Risk as feelings Psychological Bulletin 127 267ndash286

TRUST IN AUTOMATION 79

MacMillan J Entin E B amp Serfaty D (1994) Operator relianceon automated support for target acquisition In Proceedings ofthe Human Factors and Ergonomics Society 38th AnnualMeeting (pp 1285ndash1289) Santa Monica CA Human Factorsand Ergonomics Society

Malhotra D amp Murnighan J K (2002) The effects of contractson interpersonal trust Administrative Science Quarterly 47534ndash559

Masalonis A J (2000) Effects of situation-specific reliability ontrust and usage of automated decision aids Unpublished PhDdissertation Catholic University of America Washington DC

Mayer R C Davis J H amp Schoorman F D (1995) An integra-tive model of organizational trust Academy of ManagementReview 20 709ndash734

McKnight D H Cummings L L amp Chervany N L (1998)Initial trust formation in new organizational relationshipsAcademy of Management Review 23 473ndash490

Meyer J (2001) Effects of warning validity and proximity onresponses to warnings Human Factors 43 563ndash572

Meyerson D Weick K E amp Kramer R M (1996) Swift trustand temporary groups In R M Kramer amp T R Tyler (Eds)Trust in organizations Frontiers of theory and research (pp166ndash195) Thousand Oaks CA Sage

Milewski A E amp Lewis S H (1997) Delegating to softwareagents International Journal of Human-Computer Studies 46485ndash500

Miller C A (2002) Definitions and dimensions of etiquette In CMiller (Ed) Etiquette for human-computer work (Tech ReportFS-02-02 pp 1ndash7) Menlo Park CA American Association forArtificial Intelligence

Mishra A K (1996) Organizational response to crisis In R MKramer amp T R Tyler (Eds) Trust in organizations Frontiers oftheory and research (pp 261ndash287) Thousand Oaks CA Sage

Molloy R amp Parasuraman R (1996) Monitoring an automatedsystem for a single failure Vigilance and task complexityeffects Human Factors 38 311ndash322

Montgomery D A amp Sorkin R D (1996) Observer sensitivityto element reliability in a multielement visual display HumanFactors 38 484ndash494

Moorman C Deshpande R amp Zaltman G (1993) Factors affect-ing trust in market-research relationships Journal of Marketing57(1) 81ndash101

Moray N (2003) Monitoring complacency scepticism and eutacticbehaviour International Journal of Industrial Ergonomics 31175ndash178

Moray N amp Inagaki T (1999) Laboratory studies of trust be-tween humans and machines in automated systems Transactionsof the Institute of Measurement and Control 21 203ndash211

Moray N Inagaki T amp Itoh M (2000) Adaptive automationtrust and self-confidence in fault management of time-criticaltasks Journal of Experimental Psychology ndash Applied 6 44ndash58

Morgan R M amp Hunt S D (1994) The commitment-trust theoryof relationship marketing Journal of Marketing 58(3) 20ndash38

Mosier K L Skitka L J Heers S amp Burdick M (1998)Automation bias Decision making and performance in high-tech cockpits International Journal of Aviation Psychology 847ndash63

Mosier K L Skitka L J amp Korte K J (1994) Cognitive andsocial issues in flight crewautomation interaction In MMouloua amp R Parasuraman (Eds) Human performance inautomated systems Current research and trends (pp 191ndash197)Hillsdale NJ Erlbaum

Muir B M (1987) Trust between humans and machines and thedesign of decision aides International Journal of Man-Machine Studies 27 527ndash539

Muir, B. M. (1989). Operators' trust in and use of automatic controllers in a supervisory process control task. Unpublished PhD dissertation, University of Toronto, Toronto, Canada.

Muir, B. M. (1994). Trust in automation: 1. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37, 1905–1922.

Muir, B. M., & Moray, N. (1996). Trust in automation: 2. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429–460.

Muller, G. (1996). Secure communication – Trust in technology or trust with technology? Interdisciplinary Science Reviews, 21, 336–347.

Nass, C., & Lee, K. N. (2001). Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of Experimental Psychology – Applied, 7, 171–181.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81–103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223–239.

National Transportation Safety Board. (1997). Marine accident report – Grounding of the Panamanian passenger ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (NTSB/MAR-97/01). Washington, DC: Author.

Norman, D. A., Ortony, A., & Russell, D. M. (2003). Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42, 38–44.

Nyhan, R. C. (2000). Changing the paradigm – Trust and its role in public sector organizations. American Review of Public Administration, 30, 87–109.

Ockerman, J. J. (1999). Over-reliance issues with task-guidance systems. In Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting (pp. 1192–1196). Santa Monica, CA: Human Factors and Ergonomics Society.

Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action. American Political Science Review, 92, 1–22.

Pally, R. (1998). Emotional processing: The mind-body connection. International Journal of Psycho-Analysis, 79, 349–362.

Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: The neuro-affective foundations of musical appreciation. Behavioural Processes, 60, 133–155.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3, 1–23.

Parasuraman, R., Mouloua, M., & Molloy, R. (1994). Monitoring automation failures in human-machine systems. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 45–49). Hillsdale, NJ: Erlbaum.

Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

Parasuraman, S., Singh, I. L., Molloy, R., & Parasuraman, R. (1992). Automation-related complacency: A source of vulnerability in contemporary organizations. IFIP Transactions A – Computer Science and Technology, 13, 426–432.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.

Rasmussen, J., Pejterson, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.

Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Implications for ecological interface design. International Journal of Man-Machine Studies, 31, 517–534.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49(1), 95–112.

Riegelsberger, J., & Sasse, M. A. (2002). Face it – Photos don't make a Web site trustworthy. In CHI 2002 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 742–743). New York: Association for Computing Machinery.

Riley, V. (1996). Operator reliance on automation: Theory and data. In R. Parasuraman & M. Mouloua (Eds.), Automation theory and applications (pp. 19–35). Mahwah, NJ: Erlbaum.


Riley, V. A. (1994). Human use of automation. Unpublished PhD dissertation, University of Minnesota, Minneapolis, MN.

Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., & Kilts, C. D. (2002). A neural basis for social cooperation. Neuron, 35, 395–405.

Ring, P. S., & Vandeven, A. H. (1992). Structuring cooperative relationships between organizations. Strategic Management Journal, 13, 483–498.

Ross, W., & LaCroix, J. (1996). Multiple meanings of trust in negotiation theory and research: A literature review and integrative model. International Journal of Conflict Management, 7, 314–360.

Rotter, J. B. (1967). A new scale for the measurement of interpersonal trust. Journal of Personality, 35, 651–665.

Rotter, J. B. (1971). Generalized expectancies for interpersonal trust. American Psychologist, 26, 443–452.

Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.

Rousseau, D., Sitkin, S., Burt, R., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23, 393–404.

Sato, Y. (1999). Trust and communication. Sociological Theory and Methods, 13, 155–168.

Schindler, P. L., & Thomas, C. C. (1993). The structure of interpersonal trust in the workplace. Psychological Reports, 73, 563–573.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

Shaw, R., & Kinsella-Shaw, J. (1988). Ecological mechanics: A physical geometry for intentional constraints. Human Movement Science, 7, 155–200.

Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. In Proceedings of the IFAC 6th World Congress (pp. 1–6). Laxenburg, Austria: International Federation of Automatic Control.

Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge, MA: MIT Press.

Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy.

Simon, H. (1957). Models of man. New York: Wiley.

Simon, H. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. Journal of General Psychology, 120, 357–373.

Sitkin, S. B., & Roth, N. L. (1993). Explaining the limited effectiveness of legalistic "remedies" for trust/distrust. Organization Science, 4, 367–392.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.

Slovic, P., Finucane, M. L., Peters, E., & McGregor, D. G. (in press). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis.

Sorkin, R. D., & Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49–75.

Sparaco, P. (1995, January 30). Airbus seeks to keep pilot, new technology in harmony. Aviation Week and Space Technology, pp. 62–63.

Stack, L. (1978). Trust. In H. London & J. E. Exner, Jr. (Eds.), Dimensions of personality (pp. 561–599). New York: Wiley.

Tan, H. H., & Tan, C. S. (2000). Toward the differentiation of trust in supervisor and trust in organization. Genetic, Social, and General Psychological Monographs, 126, 241–260.

Tenney, Y. J., Rogers, W. H., & Pew, R. W. (1998). Pilot opinions on cockpit automation issues. International Journal of Aviation Psychology, 8, 103–120.

Thackray, R. I., & Touchstone, R. M. (1989). Detection efficiency on an air traffic control monitoring task with and without computer aiding. Aviation, Space, and Environmental Medicine, 60, 744–748.

Thelen, E., Schoner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 24, 1–86.

Trentesaux, D., Moray, N., & Tahon, C. (1998). Integration of the human operator into responsive discrete production management systems. European Journal of Operational Research, 109, 342–361.

Tseng, S., & Fogg, B. J. (1999). Credibility and computing technology. Communications of the ACM, 42(5), 39–44.

van Gelder, T., & Port, R. E. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion (pp. 1–43). Cambridge, MA: MIT Press.

Van House, N. A., Butler, M. H., & Schiff, L. R. (1998). Cooperating knowledge work and trust: Sharing environmental planning data sets. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (pp. 335–343). New York: Association for Computing Machinery.

Vicente, K. J. (1999). Cognitive work analysis: Towards safe, productive, and healthy computer-based work. Mahwah, NJ: Erlbaum.

Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.

Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606.

Wickens, C. D., Gempler, K., & Morphew, M. E. (2000). Workload and reliability of predictor displays in aircraft traffic avoidance. Transportation Human Factors, 2, 99–126.

Wicks, A. C., Berman, S. L., & Jones, T. M. (1999). The structure of optimal trust: Moral and strategic. Academy of Management Review, 24, 99–116.

Williamson, O. (1993). Calculativeness, trust, and economic organization. Journal of Law and Economics, 36, 453–486.

Wright, G., Rowe, G., Bolger, F., & Gammack, J. (1994). Coherence, calibration, and expertise in judgmental probability forecasting. Organizational Behavior and Human Decision Processes, 57, 1–25.

Yamagishi, T., & Yamagishi, M. (1994). Trust and commitment in the United States and Japan. Motivation and Emotion, 18, 129–166.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Human Factors, 43, 355–365.

Zheng, J., Bos, N., Olson, J. S., & Olson, G. M. (2001). Trust without touch: Jump-start trust with social chat. In CHI 2001 Conference on Human Factors in Computing Systems: Extended Abstracts (pp. 293–294). New York: Association for Computing Machinery.

Zuboff, S. (1988). In the age of smart machines: The future of work, technology, and power. New York: Basic Books.

John D. Lee is an associate professor of industrial engineering and director of the Cognitive Systems Laboratory at the University of Iowa. He received his PhD in mechanical engineering in 1992 from the University of Illinois at Urbana-Champaign.

Katrina A. See is a human-systems analyst with Aptima in Woburn, Massachusetts. She received an MS in industrial engineering, with a focus on trust in automation, from the University of Iowa in 2002.

Date received: August 5, 2002
Date accepted: September 4, 2003
