8/4/2019 ReillyWSN Believable Social and Emotional Agents PhD CMU CS 96 138
Believable Social and Emotional Agents

W. Scott Neal Reilly

May 1996
CMU-CS-96-138

School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213-3890

Thesis Committee:
Joseph Bates, Chair
Jaime Carbonell
Reid Simmons
Aaron Sloman, University of Birmingham, England

Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy.
This research was partially supported by Fujitsu Laboratories, Mitsubishi Electric Research Labs,
and Justsystem Corporation. The views and conclusions contained in this document are those of
the author and should not be interpreted as representing the official policies, either expressed or
implied, of Fujitsu Laboratories, Mitsubishi Electric Research Labs, or Justsystem Corporation.
Keywords: Artificial Intelligence, Interactive Art, Interactive Drama, Interactive Entertainment, Interactive Fiction, Believable Agents, Emotional Agents, Social Agents, Interactive Characters, Oz, Tok, Em, Agent Architectures, Autonomous Agents, Social Behaviors, Agent Modeling, Personality
BELIEVABLE SOCIAL AND EMOTIONAL AGENTS ii
Abstract

One of the key steps in creating quality interactive drama is the ability to create quality interactive characters (or believable agents). Two important aspects of such characters are that they appear emotional and that they can engage in social interactions. My basic approach to these problems has been to use a broad agent architecture and minimal amounts of modeling of other agents in the environment. This approach is based on an understanding of the artistic nature of the problem.
To enable agent-builders (artists) to create emotional agents, I provide a general
framework for building emotional agents, default emotion-processing rules, and
discussion about how to create quality, emotional characters. My framework gets
a lot of its power from being part of a broad agent architecture. The concept is
simple: the agent will be emotionally richer if there are more things to have
emotions about and more ways to express them. This reliance on breadth has
also meant that I have been able to create simple emotion models that rely on
perception and motivation instead of deep modeling of other agents and complex
cognitive processing.
To enable agent builders to create social behaviors for believable agents, I have
designed a methodology that provides heuristics for incorporating personality
into social behaviors and suggests how to model other agents in the environment.
I propose an approach to modeling other agents that calls for limiting the amount of modeling of other agents to that which is sufficient to create the desired behavior. Using this technique, I have been able to build robust social behaviors
that use surprisingly little representation. I have used this methodology to build a
number of social behaviors, like negotiation and making friends.
I have built three simulations containing seven agents to drive and test this work. I have also conducted user studies to demonstrate that these agents appear to be emotional and can engage in non-trivial social interactions while also being good characters with distinct personalities.
Acknowledgements

I would like to thank my advisors, Joe and Jaime. When you get to CMU as a first-year graduate student, you're told that there are basically two kinds of advisors: the younger advisor with a lot of time and enthusiasm (maybe too much time and enthusiasm) and the more experienced advisor whom you don't see much but who is tremendously helpful when you do catch him or her in the country. I got the best of both worlds. Joe's vision and enthusiasm were contagious from the start, and Jaime has been able to see to the heart of what I've been doing and make useful suggestions better than anyone I have met outside the Oz group.

I would like to thank my other committee members, Reid and Aaron, who have helped make my work more rigorous and scientific when I would have preferred to take an easier path.

I would like to thank Sara Kiesler of CMU's Department of Social and Decision Sciences, who helped me put together the experiments that I describe.

I would like to thank all the members of the Oz Project who have made this work so much fun, including Bryan, Peter, Phoebe, Mark, Matt and, of course, Joe.

I would like to thank all of the people who have made my stay at CMU exciting, stimulating, and a whole lot of fun, including Sharon, Catherine, and a group of friends that I would love to list, but I'm too afraid of leaving someone off. Besides, this thesis is long enough as it is.

I would like to thank Phoebe, LeAnn, and Bob, who read this monster and helped make it much more readable and comprehensible. You should probably thank them too.

Finally, I would like to thank my parents for supporting me and believing in me all these years. I would also like to thank LeAnn, who gives meaning to my life and to my work.
Table of Contents

Chapter 1. Introduction
  1.1 Imagine
  1.2 Structure and Contributions of the Thesis
  1.3 Interactive Drama and the Oz Project
  1.4 Believable Agents
  1.5 The Relationship between Emotions and Social Behavior
  1.6 Simulation Systems
  1.7 Summary

Part I. Believable Emotional Agents

Chapter 2. Believable Emotional Agents
  2.1 Introduction to the Problem
  2.2 Foundation
  2.3 Contribution: Tools for Building Emotional Agents
  2.4 Key Idea: Broad Emotional Agents
  2.5 Summary

Chapter 3. Emotion Generation
  3.1 Key Ideas in This Approach
  3.2 The Tools: The Em Architecture
  3.3 The Tools: Em's Default Emotion Generation Rules
  3.4 Discussion: Types of Emotion Generation Rules
  3.5 Summary

Chapter 4. Emotion Storage
  4.1 Storage
  4.2 Combination
  4.3 Decay
  4.4 Querying
  4.5 How to Create a New Emotion Type
  4.6 Summary

Chapter 5. Expressing Emotions
  5.1 Behavioral Features
  5.2 Emotional Expression in Em
  5.3 Summary
Chapter 6. Validation
  6.1 Emotion Claims
  6.2 Validation of the Em System
  6.3 Testing the Internals of Em
  6.4 Summary

Part II. Believable Social Agents

Chapter 7. Believable Social Agents
  7.1 Introduction and Overview of the Problem
  7.2 The Goals
  7.3 Methodology for Building Social Behaviors
  7.4 Related Work
  7.5 Summary

Chapter 8. Understanding the Methodology I: Negotiation
  8.1 Motivation for Building Negotiation
  8.2 Traces of Negotiating Believable Agents
  8.3 Creating a Behavior with Personality
  8.4 The Role of Representation in Negotiation
  8.5 Summary

Chapter 9. Understanding the Methodology II: Making Friends
  9.1 Motivation for Building a Making-Friends Behavior
  9.2 Ways to Make Friends
  9.3 Helping Others Achieve Their Goals
  9.4 Making Friends by Modifying Other Behaviors
  9.5 Summary

Chapter 10. Validation of the Methodology
  10.1 Experimental Methodology
  10.2 Experimental Results
  10.3 Summary

Part III. Summary & Future Work

Chapter 11. Summary & Future Directions
  11.1 Summary: Believable Emotional Agents
  11.2 Summary: Believable Social Agents
  11.3 Future Directions
  11.4 The Art of Building Social and Emotional Believable Agents
Part IV. Appendices

Appendix A. Traces From the Simulation Systems
  A.1 Robbery World
  A.2 Office Politics
  A.3 The Playground

Appendix B. User Studies
  B.1 Study 1: Evaluating the Characters on The Playground
  B.2 Study 2: Evaluating the Gunman's Emotions

Bibliography
CHAPTER 1 Introduction
1.1 Imagine
Imagine you could enter into the world of Indiana Jones: that you could be the world-famous archaeologist looking for lost treasure in exotic parts of the world, meeting interesting people and clashing with treacherous villains.
Imagine you could play the part of Hercule Poirot or Miss Marple trying to solve
an intriguing and dangerous murder mystery.
Imagine you could be Sir Galahad on the quest for the Holy Grail. Or a hardened police sergeant trying to rescue a hostage. Or a space explorer meeting new intelligent civilizations.
Why can't you do these things? These are the kinds of things that many computer and video games claim to offer, but all of them seem to fall short of their promises.

The problem, in part, seems to be that computer and video games have attempted to succeed purely on their interactive nature. Interactivity is certainly a powerful tool (see Sloan [Sloan91] and Kelso et al. [Kelso92]), but is it enough?

I believe the answer is no.

What is missing in current computer and video games? It seems to be the same two things that have made good novels stand out from bad novels, good movies
from bad movies, and good drama from bad drama. These elements are plot and character.
The Oz Project at Carnegie Mellon has been developing technology that we hope will make it possible for artists to create simulated worlds that contain rich characters and that give a human interactor the feeling of being an important part of an interesting story. We hope the interactor can suspend his or her disbelief and become deeply engaged by the experience, just as people do with a good movie or novel. In fact, we hope that the experience could be even more intense than that provided by a good movie because it is interactive. We call such experiences interactive drama.
There are (at least) two main problems in creating such a system. The first is how to create simulated worlds where the user has the feeling of freedom but where the user also has some artist-shaped experience. This problem is being studied by Weyhrauch [Weyhrauch96] and will not be central to the work described here.

The second main problem with creating interactive story worlds is how to build characters for such worlds that are as rich as characters in other media (e.g., movies, novels) while also being interactive. These interactive characters are also called believable agents.
Artificial Intelligence (AI) has developed a number of tools for building interactive agents that provide a good starting point, but they are not sufficient. For example, AI has not traditionally been concerned with creating agents that are emotional. Traditional AI has also focused on multi-agent interactions for the purposes of problem solving but has not investigated how to create agents that interact with each other while displaying distinctive personalities. I believe that being able to create artistically defined characters that display emotions and engage in personality-based social interactions will be as critical to this new artistic medium as it is to traditional, non-interactive, artistic media.
The goal of building believable agents is inherently an artistic one. Traditional AI goals of creating competence and building models of human cognition are only tangentially related because creating believability is not the same as creating intelligence or realism. Therefore, the tools that have been designed for those tasks are not appropriate. I will return to this point in section 1.4.1.
My approach to the problems of creating believable agents that are social and
emotional is to create a new set of tools and methodologies that are suitable for
the artistic nature of these problems. These tools and methodologies are a first
step towards enabling artists to create interactive story systems with quality
interactive characters that can display emotions and engage in believable social
behaviors.
1.2 Structure and Contributions of the Thesis
This introductory chapter will provide an overview of interactive drama, interac-
tive characters (or believable agents), and the relationship between the emotional
and social aspects of agents. Because this is a new and rather unusual kind of
problem (at least as far as traditional artificial intelligence is concerned), the
background and motivation provided in the introduction are crucial for understanding what follows.
After I have provided this important background material, the rest of the thesis is
broken into three parts. Part I deals with the creation of believable emotional
agents; Part II deals with the creation of believable social agents; Part III provides a summary of the main contributions of the thesis and some speculation
about future directions for the research.
Part I of the thesis will focus on creating believable emotional agents. The major
contributions of this part of the thesis include:
- A set of tools for creating believable emotional agents that includes:
  - a framework for building believable emotional agents,
  - a default set of emotional processes to provide reasonable default emotional behavior, and
  - discussions about how to create specific emotional characters within this framework.
- A methodology for creating emotions within a broad set of capabilities that allows artists to create emotionally rich characters. This methodology also enabled me to create models of how to generate emotions that rely on perception and motivation as well as cognition. These models can be simpler and faster than purely cognitive models.
- Validation that the tools I have built can be used to create characters that users find to be both emotional and believable.
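To make the perception-and-motivation-driven approach above concrete, here is a minimal sketch of the kind of default rule such a framework might contain: emotions arise from the fate of goals, scaled by how much the agent cares. This is my own illustrative code, not the Em system; every name in it is invented.

```python
# Illustrative sketch (not Em's actual code): a default rule in which
# goal success generates joy and goal failure generates distress,
# each scaled by the importance the agent assigns to the goal.
from dataclasses import dataclass, field

@dataclass
class Agent:
    emotions: dict = field(default_factory=dict)  # emotion type -> intensity

    def add_emotion(self, kind, intensity):
        # Combine with any existing emotion of the same type.
        self.emotions[kind] = self.emotions.get(kind, 0) + intensity

def on_goal_outcome(agent, goal_importance, succeeded):
    """Default rule: success -> joy, failure -> distress."""
    if succeeded:
        agent.add_emotion("joy", goal_importance)
    else:
        agent.add_emotion("distress", goal_importance)

a = Agent()
on_goal_outcome(a, goal_importance=5, succeeded=False)
print(a.emotions)  # {'distress': 5}
```

The point of the sketch is how little machinery a default rule needs once the architecture supplies goals and their importances; richer behavior comes from having more goals, not from deeper rules.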
Part II of this thesis will focus on creating believable social agents. The major
contributions of this part of the thesis include:
- A two-part methodology for creating believable social behaviors for specific characters.
  - Part 1 of the methodology suggests a number of important elements of personality that should be incorporated into social behaviors in order to make them personality-rich.
  - Part 2 of the methodology prescribes using a minimal amount of representation for modeling other agents in the environment.
- A set of believable social behaviors that provide:
  - case studies for explaining the methodology in depth and how to apply it in practice,
  - evidence for the breadth of behaviors the methodology can be used to create, and
  - examples of social behaviors for specific characters that use small amounts of representation of other agents.
- Validation that users find that social characters built using this methodology can be good characters.

Finally, in Part III, I will summarize the contributions of the thesis and speculate about possible future directions for this work.

Now, I begin with an introduction to interactive drama, which will provide the context and motivation for the work described in the rest of the thesis.

1.3 Interactive Drama and the Oz Project

The dream of interactive drama isn't necessarily a unified one, though I don't mean that in a derogatory sense. I mean it in the same sense that fine arts or computer science or any other large field isn't coherent. Interactive drama is just an umbrella phrase for a set of rather different kinds of things.

Interactive drama might have a single human interactor (often called the "user" or the "player") or many interactors. The story might be created from a pre-defined set of possible user choices or it might emerge from a (possibly guided [Weyhrauch96]) complex simulation. The user might see the world through text, animation, or a full-blown virtual-reality interface. The user might act in
the drama by typing commands, moving a mouse, or through speech and gestures.

And these are only a few of the possibilities.
One type of interactive drama is interactive fiction, which involves a text-based
interface where the user types commands and is provided with text descriptions
of the world. The plots emerge from the simulation, which is (traditionally) a
simple physical environment with mostly physically based puzzles.1 Solving one
puzzle allows the user to get to the next puzzle, which provides direction to the
plot. Characters are very simple or non-existent. One popular example from this
genre is Zork [Blank80].
Another type of interactive drama is the interactive play, which is (typically) a traditional play with a number of choice-points (usually one) that allow the audience to pick one of a few paths. All of the possible plots are determined ahead of time. The characters are the actors and do not directly interact with the audience. For example, near the end of The Mystery of Edwin Drood [Holmes86] there is an intermission at which point the audience gets to decide who the murderer will be; the cast acts out a different (but pre-scripted) ending based on that choice.
1.3.1 The Oz Project
The work I will describe is being done in the context of the Oz system, which is a set of tools for creating certain kinds of interactive drama. Although the Oz notion of interactive drama has its boundaries, I still believe that the ideas about interactive plot and characters that have developed within the Oz system will be applicable to a fair range of interactive drama systems [Bates92c].
Figure 1-1 shows the Oz architecture for an interactive drama system. There is a physical-world simulation that includes some number of characters. One of the characters is, or is controlled by, a human user. How the user sees the world and acts in the world are not strictly defined. There is a drama manager that is responsible for subtly controlling the user's experience towards some author-defined end. The drama manager can manipulate the physical world, the autonomous characters, and the user interface in order to create a dramatic experience for the user.
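The architecture just described can be sketched as a simple update cycle. All class and method names below are my own invention for illustration; they are not part of the Oz implementation.

```python
# Hedged sketch of the Figure 1-1 architecture: a physical-world
# simulation containing computer-controlled characters, a human user
# acting through an interface, and a drama-manager hook that may
# adjust the experience. Names are invented, not from Oz.

class World:
    def __init__(self):
        self.events = []

    def apply(self, action):
        # Every action, human or computer, goes through the simulation.
        self.events.append(action)

    def sense_data(self):
        return list(self.events)

class Character:
    def __init__(self, name):
        self.name = name

    def act(self, sense_data):
        # A real character would run its architecture here.
        return f"{self.name} reacts to {len(sense_data)} events"

def tick(world, characters, user_action, drama_manager=None):
    """One cycle: the user and the characters act on the world; the
    drama manager (here an optional callback) may steer the result."""
    world.apply(user_action)
    for c in characters:
        world.apply(c.act(world.sense_data()))
    if drama_manager:
        drama_manager(world, characters)
    return world.sense_data()

w = World()
log = tick(w, [Character("Lyotard")], "pet Lyotard")
print(log)  # ['pet Lyotard', 'Lyotard reacts to 1 events']
```

The sketch shows why the model excludes branching-structure systems: the story is whatever the simulation produces each cycle, not a pre-defined path.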
1. This is not completely accurate, as some works that call themselves interactive fiction are hypertext-based. For example, choose-your-own-adventure books are a kind of interactive fiction where the player acts by picking from a list of pages to turn to that represent different actions.
FIGURE 1-1 The Oz Interactive Drama Architecture.
Classic interactive fiction, like Zork, fits this model, though the drama manager slot is empty and the characters (other than the user) are few and simple. Interactive cinema would not fit this model because it is based on a pre-defined branching structure instead of on a physical simulation with autonomous characters. Choose-your-own-adventure stories do not fit this model for the same reason.
Within this model we1 have built two very different kinds of interactive drama systems. The first system is a text-based system that is similar to interactive-fiction games in feel; the user types commands and is given text descriptions of the world. We have built a number of worlds in this style, the first of which was an apartment with a house cat, Lyotard, as its sole occupant (see [Bates92a] & [Bates92b]). The user's goal was to explore the apartment and make friends with

1. "We," in this chapter, refers to the members of the Oz group.
[Diagram for Figure 1-1, showing: Physical World Simulation, Computer-controlled Characters, Human User, Interface, and Drama Manager, with Actions and Sense Data flowing between them.]
the cat, though there was no drama manager to push this story. The drama manager is the subject of ongoing research [Weyhrauch96] and was not used in any of the systems that I will describe in this thesis. Figure 1-2 provides a simple trace of a user interacting with this world.
The second type of system we have built is a graphical world with animated creatures. The user controls a creature in the world with a mouse. The only world we have built in this style so far is called "Edge of Intention" or "The Woggles." This world contains three ellipsoidal creatures called woggles, each with a unique personality. The user controls the motions of a fourth woggle. Again, there is no drama manager in this world, so the story is generated purely from the interactions of the characters. Figure 1-3 shows the woggles posing for the camera.
1.4 Believable Agents
An important part of building good interactive drama is building good interactive
characters. Not all good stories will require characters, but certainly many will.
Much of the work on the Oz system has gone into building tools that will allow
artists to create personality-rich, interactive characters.
This section presents the Oz approach to building believable agents in a top-down manner. I start with an overview of the nature of the problem; understanding the problem clearly is crucial to understanding my proposed solutions. Then I explain the broad but shallow methodology for building believable agents. In section 1.4.4, I describe the Tok agent architecture; Tok is a broad architecture specifically developed for building believable agents. The section concludes with a brief overview of the Hap language, which is the programming language used to build the various components of the Tok architecture. Understanding the details of the language is important only insofar as it helps the reader understand how I implemented the systems I will describe later in the thesis.
FIGURE 1-2 An Interaction with Lyotard. This is part of a larger trace and has been edited for brevity. The user types commands at the PLAYER> prompt. All other text is computer generated, so it may not be very interesting, but it is consistent with the underlying physical simulation. Consistency is a problem in many other interactive fiction systems that rely on canned text.
You are in the dining room.
To the south, you see the sunroom.
To the east, you see the kitchen.
The end table and the small chair are in the dining room.
The jar is on the end table.
The nine black sardines are in the jar.

PLAYER> take the jar
You take the jar.

PLAYER> go south
You are in the sunroom.
Lyotard goes to the sunroom.

PLAYER> give a sardine to lyotard
You offer the black sardine to Lyotard.
Lyotard runs to the dining room.

PLAYER> follow lyotard
You run to the dining room.
Lyotard looks around nervously.

PLAYER> pet lyotard
You pet Lyotard.
Lyotard bites you.
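The simulation-backed consistency that the caption contrasts with canned text can be illustrated with a toy read-eval loop: every response is generated from a small world model, so output never contradicts state. This sketch is invented and far simpler than the Oz system; the world contents echo the trace only for familiarity.

```python
# Toy command loop (not Oz): responses are derived from a world
# model rather than canned, so the description always matches state.
world = {"player_room": "dining room", "jar_location": "end table"}

def execute(command):
    # A real parser would be far richer; two commands suffice here.
    if command == "take the jar":
        world["jar_location"] = "player"
        return "You take the jar."
    if command.startswith("go "):
        world["player_room"] = command[3:]
        return f"You are in the {world['player_room']}."
    return "Nothing happens."

print(execute("take the jar"))  # You take the jar.
print(execute("go sunroom"))    # You are in the sunroom.
```

Because the text is regenerated from `world` on every command, taking the jar and then looking around can never report the jar still on the end table; canned-text systems have no such guarantee.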
FIGURE 1-3 The Woggles. The stripes on the bodies are due to the world's lighting.
1.4.1 The Nature of the Goal
The problem of creating believable agents lies somewhere between the arts and
artificial intelligence (AI).
Artists know how to create believable characters, although they do not have provably correct methods for doing so. If they did, there wouldn't ever be bad characters. Artists do, however, have a sense for what works and what doesn't. In order to create believable agents, we need to understand what artists know.

AI researchers know how to create autonomous agents. There are a number of AI architectures that can be used to create goal-driven, reactive, robust, autonomous agents. In order to create believable agents, we need to know what AI researchers know.
The goal of the Oz Project is to join these two disciplines to produce autonomous, interactive agents that have the qualities that have made the non-interactive characters of traditional media believable. This is what we mean by believable agents.
The Oz approach to creating believable agents (and my approach to creating be-
lievable emotions and social behaviors) is to start with the artistic nature of the
goal and work backwards to the tools that are appropriate to such a goal. In other
words, instead of bringing to the task our non-artistic notions of what the goal
should be, we have read and studied the arts and listened to artists. Instead of
starting with AI tools designed for other tasks, we have built tools specifically to
support this artistic task.
The term "believable" is a specific term from the arts describing characters that work. Believable characters are characters that seem to be alive and that an audience has emotions for or about. Believable does not mean honest, convincing, or realistic; it gets at something else, something artistic. For the remainder of this section, I will discuss three lessons from the arts about the fundamental nature of believability that I feel are critical to understanding the (sometimes unusual) directions of the Oz work and, more specifically, my work.
1. Believable agents may not be intelligent.

Much AI research is devoted to creating intelligent agents that can perform difficult tasks quickly and efficiently. In our domain, intelligence often takes a back seat to other concerns. It will even occasionally be desirable to have stupid characters (e.g., Forrest Gump or Woody on Cheers).
Lesson: We don't want agent architectures that enforce rationality and
intelligence. AI systems designed for these goals would be inappropriate for
building believable agents.
2. Believable agents may not be realistic.
Some branches of AI try to mimic nature, either in order to understand
humans better or to produce efficient architectures. Cognitive modeling and
work on lifelike agents (e.g., ALIVE [Maes95]) are examples of this kind
of approach. Unrealistic characters, however, can be very believable. For
instance, animated characters are far from being realistic, but often make very
believable characters. In many animated films, the best characters are often
talking animals, furniture, and other non-realistic sorts of things. Bugs Bunny
and many of the characters in Disney's Beauty and the Beast are good
examples.
In fact, it turns out that sometimes being more realistic can decrease believability.
For example, watching extremely realistic animation of human faces,
like that of Terzopoulus [Terzopoulus95], can be somewhat disturbing, whereas
watching unrealistic animation, like Charlie Brown, can be very satisfying.
The reason for this is that the state of the art in computer animation can make
a mostly realistic human face, but not a completely realistic one; this close-but-not-quite
face is very disturbing to watch because people are so well
adapted to watching human faces. From the standpoint of believability, it is
better to go with the less realistic characters which meet the audience's
expectations than to go with the more realistic characters which don't.
Another key reason to avoid realism is that powerful artistic techniques, like
abstraction/simplification and exaggeration, rely on altering reality for more
effective characters. The idea behind these techniques is that the artist (or actor)
is attempting to communicate the essential personality of the character to
the audience. Because of this, artistic characters rarely (never?) have personalities
as complex as humans, and the important traits they do have are often
exaggerated for emphasis. Felix and Oscar in Neil Simon's The Odd Couple
[Simon66] are good examples of simplified, exaggerated characters.
Lesson: We don't want architectures that enforce realism. This means that
we don't want systems that only generate externally realistic behavior. It also
means that we are willing to use unrealistic internal processing: the goal is
not to create cognitively plausible agents; it is to create good characters.
3. Believable agents will have strong personalities.
Most AI researchers would be happy with personality-deficient agents that
could competently perform useful tasks. However, one of the strengths of
traditional characters is their interesting personalities. A goal of our work is to
be able to create a variety of different agents, each with a distinct and interesting
personality. These personalities should affect everything about the agent,
including how the agent moves, thinks, and talks. Also, idiosyncratic quirks
are extremely important parts of the agent's personality. Traditional AI isn't
particularly interested in either artistic personalities or variation across agents.
Lesson: Personality should permeate the architecture and should not be
constrained more than is absolutely necessary.
In our quest for believability, the Oz Project has adopted an approach that we
think best allows us to achieve the kinds of characters we want. We call this the
broad but shallow approach to building agents.
1.4.2 Broad Agents
Members of the Oz project (see, for example, [Bates91] and [Reilly94]) have
previously argued that the way to create believable agents is to use broad agent
architectures. Such architectures have three important properties: they have a
broad set of capabilities, each capability is typically (but not necessarily)
somewhat shallow, and all of the capabilities are tightly integrated. I will discuss each
of these properties in turn and describe why we feel each is important for our
primary goal of creating believable agents.
Using a Broad Set of Capabilities
The focus of much AI research has been on creating systems that do a small
number of things particularly well, such as language understanding, problem
solving, or learning (for an exception, see [Sloman94]). Even potentially broad
architectures like SOAR [Laird87] were used for many years very narrowly,
usually doing only a few things in any one program. (Recently, however, SOAR
projects like TacAir SOAR have started to use more breadth [Tambe95].)
Believable agents need to be capable of behaving like interesting characters in a
simulated environment. Such agents need to appear to have goals and emotions
and they need to interact naturally and reasonably with their environment. They
need to have enough capabilities to handle the variety of situations they are
likely to encounter in an environment containing a human user. If the characters
don't have a broad set of capabilities, they will likely break the user's suspension
of disbelief. Such agents will need a broad set of capabilities, like perception,
language understanding, language generation, emotions, goal-driven behavior,
reactivity to the environment, memory, inference, social skills, and possibly
others.
Some of these demands are simply a product of the agent needing to act within a
complex, dynamic simulated world, but many are suggested to us by artists in
other media. For instance, Thomas and Johnston [Thomas81], who are old-time
Disney animators, tell us about the importance of emotion, perception, the
appearance of thought, and interpersonal interactions for bringing animated
characters to life.
Shallowness and the Suspension of Disbelief
We generally think, however, that we will only need to model the set of capabilities
shallowly. Users of Oz worlds, like movie-goers and novel readers, will typically
want to suspend their disbelief. This is similar to what happens with users
of Weizenbaum's ELIZA system [Weizenbaum66], who buy into it despite its
extreme shallowness. We expect users of interactive drama systems will want to
suspend their disbelief just like moviegoers and novel readers do all the time. If
this is true, we hope that as long as our characters don't act significantly out of
character, users will tend to find them believable, often even coming up with
reasons and excuses for unusual behavior.
Kelso et al. provide some experimental evidence to support this claim [Kelso92].
Users in their experiments were placed in situations similar to those that users
would be in if they were interacting with an Oz-like system, except that the other
characters and the director were humans instead of being computer-controlled. In
this experiment, the users reported very intense experiences even though
external observers found there to be numerous problems with both the story and the
characters.
We're not claiming that depth of competence is necessarily bad or that it will
never be necessary; we're only claiming that a broad set of shallow capabilities
will be sufficient for creating many believable agents and is probably a more
fruitful approach than the more typical narrow-and-deep methodology. A broad-and-deep
approach would probably be more generally useful, but it is currently
too difficult. We also don't believe that depth is necessary for many artistic
agents.
Integration of Capabilities
One of the key elements of this broad-and-shallow approach is the integration of
the agents' various capabilities. Our experience has been that a large set of
shallow capabilities becomes much more powerful when those capabilities are
tightly integrated. By integrating capabilities, it is possible to create synergistic
effects that make the result much more powerful than individual capabilities
alone.
For example, let's look at the interactions between two capabilities: natural
language and emotion. It isn't hard to imagine building characters that lack one or
the other. In some cases that would even be desirable. Let's imagine, though, that
that's not what we want, so we put in an emotion system and a natural language
system and go to the effort of integrating them. Now we can have emotions that
are based on language, such as being angry at a verbal insult (language
understanding) or being frustrated at not being able to think of an appropriate word
(language generation). Our agent is now also capable of talking about its emotions.
Emotions might also affect how the agent speaks, such as stuttering when
the agent is nervous. All of these important effects arise only once we have
integrated emotions and language skills in our agent.
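The synergy just described can be sketched in a few lines of code. The following is an illustrative toy in Python, not the Oz/Tok implementation; the class, the thresholds, and the keyword test for insults are all invented for the example.

```python
# Toy sketch of emotion/language integration (invented, not the Oz code):
# language input raises emotion, and emotion state colors language output.

class ToyAgent:
    def __init__(self):
        # emotion intensities on an arbitrary 0..10 scale
        self.emotions = {"anger": 0, "fear": 0}

    def hear(self, utterance):
        # language understanding feeding emotion: insults raise anger
        if "stupid" in utterance or "insult" in utterance:
            self.emotions["anger"] = min(10, self.emotions["anger"] + 5)

    def say(self, text):
        # emotion feeding language generation: a fearful agent stutters,
        # an angry agent shouts
        if self.emotions["fear"] > 5:
            first = text.split()[0]
            return f"{first[0]}-{first[0]}-{text}"
        if self.emotions["anger"] > 3:
            return text.upper() + "!"
        return text

agent = ToyAgent()
agent.hear("you are stupid")       # understanding an insult raises anger
print(agent.say("leave me alone")) # anger now colors the reply
```

Neither capability alone produces this behavior; the effect appears only because the two systems share the agent's emotional state.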
1.4.3 The Tok Agent Architecture
The members of the Oz Project designed and built the Tok agent architecture
(see [Bates92a] and [Bates92b]) in an effort to provide the kind of integrated
breadth necessary to build artistic characters for interactive drama. Figure 1-4
provides a high-level view of the Tok architecture. I have made no attempt to
describe the interactions between the various components of the architecture here,
although I will discuss many of these interactions throughout the thesis.
FIGURE 1-4 The Tok agent architecture
[Figure: a Tok agent situated in a simulated physical world. Sense data flows in
through Sensing & Perception; actions and body & face changes flow out through
Action and Body State & Facial Expression. The agent's other components include
NL Understanding, NL Generation, Social Behavior & Knowledge, Emotion,
Inference, Memory, and others.]
One way we achieve the tight integration of all of these components is by writing
the code for all of the various subsystems in a common behavior-based
programming language, Hap. In the next section, I will describe Hap in more detail.
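As a rough illustration of what "tight integration" means in a structure like Figure 1-4, here is a hypothetical sketch keyed to the figure's component names; the wiring is my own simplification for this chapter, not the actual Tok code.

```python
# Hypothetical Tok-style agent shell (component names follow Figure 1-4;
# the internal wiring is an invented simplification, not the Oz system).

class TokAgentSketch:
    def __init__(self):
        self.emotions = {}       # Emotion component
        self.memory = []         # Memory component
        self.relationships = {}  # Social Behavior & Knowledge

    def perceive(self, sense_data):
        # Sensing & Perception: record the event in memory...
        self.memory.append(sense_data)
        # ...and, because the components share state, let perception
        # raise emotions directly
        if sense_data.get("threat"):
            self.emotions["fear"] = self.emotions.get("fear", 0) + 1

    def act(self):
        # Action selection consults emotion and memory together
        if self.emotions.get("fear", 0) > 0:
            return "flee"
        return "idle"

a = TokAgentSketch()
a.perceive({"threat": True})
print(a.act())
```

The point of the sketch is only that every component reads and writes shared agent state, which is what makes the synergistic effects of the previous section possible.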
1.4.4 The Hap Language
In order to build broad, believable agents, we needed a suitable language. In our
case, we use the Hap language of Loyall and Bates [Loyall96]. The Hap
language has been designed specifically to support believable agents, although it is
similar to other work such as reactive architectures (e.g., [Firby89] and
[Georgeff87]), behavior-based architectures (e.g., [Brooks86]), and situated
action (e.g., [Agre90] and [Suchman88]). Throughout the thesis, I will point out
various ways that I have extended the basic Hap language to better suit the task
of building social and emotional agents.
It is not necessary to understand Hap very deeply in order to understand what
follows. For the purposes of discussion in this thesis, the most important things
to understand about Hap are the following:
1. Hap maintains a dynamic forest of active behaviors, with each behavior
attempting to perform some set or sequence of actions. Those actions can be
either external or internal (within the mind). The actions can also include
subgoals, which lead to other behaviors.
For example, an agent might have goals to eat when hungry, sleep when tired,
and play otherwise. Each goal can lead to a number of different behaviors.
One eat behavior might be to find a restaurant and eat there; both of these
steps are subgoals that lead to other behaviors. The find-restaurant behavior
might have a mental step that stores the location of the restaurant for future
use.
2. All behaviors are pre-coded in a production memory. Hap does no planning in
the traditional sense.
For example, in the eat behavior, the behavior to find a restaurant is pre-coded
by the artist. There is no on-the-fly planning to achieve this goal, though the
choice of behaviors to accomplish goals depends on external perception and
internal state.
3. Goal success is not necessarily a testable property of the world. That is, it may
not be possible to write an expression that represents the state in which a goal
has been achieved. Instead, some goals are purely behavioral in that they
succeed when some set or sequence of actions has been performed.
For example, one behavior the agent might use for the playing goal would be
to throw a ball up in the air and catch it. The goal state and initial state are
similar. The point of the goal is the execution of some action, not the achievement
of some particular state of the world.
4. Hap supports the creation of reactive behaviors. Here are some of the ways
that this is accomplished:
Demons can create new goals. For example, the goal to sleep is created
when the agent gets tired.
Goals can be interrupted and resumed. For example, the goal to find a
restaurant is temporarily suspended to dodge out of the way of an oncoming
car and then resumed.
Preconditions on behaviors make sure that they are chosen only when
appropriate. For example, the behavior to play with a ball depends on having
a ball. If the agent doesn't have a ball, another behavior will have to be
chosen or the goal will fail.
Goals can succeed serendipitously by means of success-tests that mark the
goal as successful even if the associated behavior has not been accomplished.
For example, if the agent's goal is to get to a restaurant and the
agent starts off towards a known restaurant, the agent might come across a
new restaurant along the way that will fulfill the goal even though the chosen
behavior is not completed.
Behaviors can be rejected when the context of the agent changes; this is
done by means of context-conditions that mark behaviors as having failed
when the context is no longer appropriate. For example, the playing with a
ball behavior depends on the agent having a ball. If another agent should
take the ball away in the middle of the behavior, the behavior ends.
5. Hap allows multiple threads of processing. This way agents can have multiple
goals being processed together. The highest priority goal will be processed
until an action has been chosen for that goal or processing is otherwise halted.
Then the next highest priority goal begins processing. This ends when all
goals have been processed or when the time allotted to choose actions has
expired.1
6. Hap is a general programming language, so many types of behaviors can be
written using it, from physical behaviors to natural language behaviors to
emotion-based behaviors2.
1. There are two versions of Hap. One allows multiple threads; the other does not. Much of my work relates to both versions, though not all. Where it is relevant, I will point out what work is unique to one version of Hap or the other.
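To make the precondition, success-test, and context-condition machinery above concrete, here is a toy, Hap-inspired behavior runner. It is a sketch under the assumption that behaviors are plain step lists; real Hap (see [Loyall96]) differs substantially, and all names here are invented.

```python
# Toy, Hap-inspired behavior runner (an invented simplification, not Hap).
# It illustrates preconditions, success-tests, and context-conditions.

class Behavior:
    def __init__(self, name, steps, precondition=None,
                 success_test=None, context_condition=None):
        self.name = name
        self.steps = steps                          # actions, in order
        self.precondition = precondition            # may this behavior be chosen?
        self.success_test = success_test            # has the goal been met early?
        self.context_condition = context_condition  # is it still sensible to continue?

def run(behavior, world):
    if behavior.precondition and not behavior.precondition(world):
        return "not chosen"
    for step in behavior.steps:
        if behavior.success_test and behavior.success_test(world):
            return "succeeded early"                # serendipitous success
        if behavior.context_condition and not behavior.context_condition(world):
            return "failed (context)"               # e.g. the ball was taken away
        step(world)
    return "succeeded"

# Example: play with a ball, but only while the agent still has it.
world = {"has_ball": True}
play = Behavior(
    "play-with-ball",
    steps=[lambda w: w.__setitem__("threw", True),
           lambda w: w.__setitem__("caught", True)],
    precondition=lambda w: w["has_ball"],
    context_condition=lambda w: w["has_ball"],
)
print(run(play, world))
```

If another step (or agent) set `has_ball` to false mid-behavior, the context-condition would fail the behavior on the next step, mirroring the ball-taken-away example above.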
1.5 The Relationship between Emotions and Social Behavior
Now that I have presented an overview of the Oz system and believable agents, I
will motivate my choice of focusing on the social and emotional aspects of
believable agents.
This thesis may, at first, seem a bit disjoint, as though it were really just two
small theses bound together: one about emotions for agents and one about social
behaviors. As I described in the previous section, however, there is an advantage
to breadth and these two aspects of agents seem particularly well suited for each
other.
There are (at least) two ways that emotions and social skills combine to make
Tok agents more believable. First, the combination allows for emotions that are
greater in number and variety. Second, the relationships that agents have with
each other are more believable because of this integration of capabilities. I will
expand on these ideas in turn.
First, social factors are very important in determining emotions. Many causes of
emotion in people and in artistic characters arise from social factors. For
instance, anger, love, hate, jealousy, and grief are often associated with other
agents in the world. Without social knowledge and relationships with other
agents, there would be a large gap in the types of emotions our agents could
express.
Relationships also affect an agent's emotions in other ways, such as modifying
what emotions are felt and how strong they are when the cause of the emotion is
another agent. For example, if Sue hears Bob insulting Fred, she might have very
different emotional reactions based on her relationships with Bob and Fred. If
she is friends with Fred, she might feel intensely angry at Bob. If she likes Bob
and not Fred, she might find Bob somewhat amusing instead.
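The Sue/Bob/Fred example suggests a simple rule: the witness's emotional reaction is a function of her relationships to both parties. The following is a hypothetical sketch; the function, the numeric liking scale, and the intensity formula are invented for illustration and are not the Em model developed later.

```python
# Invented sketch: reaction to witnessing one agent insult another,
# parameterized by the witness's liking of each party (scale -10..10).

def reaction_to_insult(liking_of_insulter, liking_of_target):
    """Return (emotion, intensity) for witnessing insulter -> target."""
    if liking_of_target > 0:        # the victim is a friend: anger,
        # intensified if the witness also dislikes the insulter
        return ("anger", liking_of_target + max(0, -liking_of_insulter))
    if liking_of_insulter > 0:      # sympathies lie with the insulter
        return ("amusement", liking_of_insulter)
    return ("indifference", 0)

# Sue likes Fred (the target): intense anger at Bob
print(reaction_to_insult(liking_of_insulter=-2, liking_of_target=5))
# Sue likes Bob and not Fred: mild amusement instead
print(reaction_to_insult(liking_of_insulter=3, liking_of_target=-1))
```

The same event thus yields different emotions of different strengths depending only on the social knowledge the witness carries.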
Second, the relationships between agents are more believable because of the
integration of emotions and interpersonal relationships. This is because the
dynamics of social relationships often depend on emotions. If Bill is regularly
mean to John, it is likely that John will get angry at Bill and eventually learn to
dislike Bill. If John couldn't feel anger, he would probably continue to associate
with Bill, which seems less believable. Tok agents could enter into social
relationships with other agents without having emotions, but by integrating these
two capabilities these agents have much richer relationships that can change over
time based on emotional factors.

2. Emotion-based behaviors are things like generating fear when an important goal of the agent is threatened. Other kinds of behaviors (such as physical behaviors) written in Hap can also take emotional information into account in a variety of ways, such as walking across a room angrily. This will be expanded on in Chapter 5.
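The John/Bill dynamic can be caricatured as a liking value eroded by repeated anger. This is an assumed toy dynamic for illustration only, with an invented decay rate, not the relationship model described later in the thesis.

```python
# Invented sketch: a relationship value that changes over time because
# of emotion, rather than remaining static.

def update_liking(liking, anger_events, decay_per_event=2):
    """Each anger-provoking event lowers liking, floored at -10."""
    return max(-10, liking - decay_per_event * anger_events)

liking = 5                  # John starts out liking Bill
for _ in range(4):          # Bill is repeatedly mean; each incident angers John
    liking = update_liking(liking, anger_events=1)
print(liking)               # John has learned to dislike Bill
```

An agent without anger would skip the update entirely and keep associating with Bill, which is the less believable outcome the text describes.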
1.6 Simulation Systems
Before I get on with the rest of the thesis, I will briefly introduce three simulation
systems that I designed and built as part of this research. I used these three
systems to help test and focus my research. As I will describe, each of the
simulations I have built has been designed to push specific areas of this work.
All of the systems I will describe here are text-based, mostly because I didn't
want to spend too much time on unrelated difficulties dealing with animation,
which is not one of my areas of expertise.1 The Oz system also had some tools
for doing text-based speech interactions, whereas there were no methods for
doing either real or simulated speech in the animated systems. By working in the
text-based system I was able to explore more complex social behaviors, like
negotiation, than if I had used the animation-based system.
The three simulations I built are called Robbery World, Office Politics, and
The Playground. Each consists of a simple physical world with a few distinct
locations and a few characters. The typical interaction runs roughly 10 to 20
minutes. I think of them as interactive versions of short stories or animated
shorts. They are not especially large but are still complex enough to be
interesting.
The text descriptions of the environment and of events in the world are all
computer generated. This often makes them a bit stilted and, occasionally,
ungrammatical. In most of the traces I will present in the thesis, I will edit them to make
them more readable. This is the case with the traces presented here. Unedited
traces can be found in Appendix A.
It is important to point out that the natural language capabilities of these agents
are much more limited than they might appear. The language understanding
systems are based on keyword matching and the language generation is done by
templates. For the worlds I have created, these simple mechanisms have proved
mostly adequate. More importantly, though, they have allowed me to pursue
work in emotion and social behavior without having to first solve the difficult
problems involved in natural language generation and understanding. Loyall's
work [Loyall96] should prove to be a much better solution for believable
language generation than the simple mechanisms I have used.

1. As described in section 1.3, the Oz system has been used to develop both text-based and animation-based interactive systems. The work described in this thesis has been incorporated into both types of systems.
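The keyword-matching and template mechanisms just described might look roughly like this toy version. The rules, templates, and phrasing are invented (the refusal line is loosely adapted from Gus's dialogue in Figure 1-6); this is not the Oz code.

```python
# Toy NL machinery of the kind described above: understanding by keyword
# matching, generation by template filling (all rules invented).

KEYWORD_RULES = [
    ({"trade"}, "offer-trade"),
    ({"help", "fix"}, "request-help"),
]

def understand(utterance):
    words = set(utterance.lower().replace("?", "").split())
    for keywords, meaning in KEYWORD_RULES:
        if keywords <= words:       # all keywords present in the utterance
            return meaning
    return "unknown"

TEMPLATES = {
    "refuse-help": "Sorry, {name}. I'd like to help, but I'm swamped.",
}

def generate(act, **slots):
    return TEMPLATES[act].format(**slots)

print(understand("Can you help me fix the email server?"))
print(generate("refuse-help", name="bud"))
```

Mechanisms this shallow break easily outside their intended worlds, which is exactly the trade-off the paragraph above accepts in order to focus on emotion and social behavior.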
Robbery World was designed to push the ability of my tools to model
characters' emotions. The user plays the part of a police officer attempting to
thwart a convenience-store holdup. The other characters are a gunman and a
cashier. Figure 1-5 shows part of a sample trace from an interaction with this
simulation. For more traces, see Appendix A. As noted before, the text output of
this system is all computer generated, so it tends to be somewhat stilted.
Hopefully, the emotions and social interactions of the characters will be evident
despite the interface.
FIGURE 1-5 An interaction with Robbery World
You are in the parking lot.
To the north, you see the convenience store.
The cashier and the gunman are in the convenience store.
The gunman is holding his gun.
The gunman is wearing the ski mask.
The cashier is holding the bag.

The gunman is now red.
The gunman is now scowling.
The gunman is now tense.
The gunman says to you "Back off and nobody gets hurt!".
The cashier says to the gunman "Please don't kill me. I'll give you whatever you want.".

PLAYER> Gunman: Come out of there now, you low-life scum!
The gunman says to you "Hey, shut your trap! I ain't listening to none of your crap!".

PLAYER> Gunman: Come out now or I'm coming in!
The gunman says to the cashier "Come on! Hurry up!".

PLAYER> go north
The gunman aims his gun at you.

PLAYER> shoot the gunman
The gunman is now wounded.
The gunman is now pale.
The gunman is now bug-eyed.
The gunman is now trembling.
The gunman says to you "OK. I give up. I'm turning myself in.".
Office Politics was designed to test some social aspects of the characters, such
as how relationships between characters might affect their behaviors. It also
provides characters that engage in simple social behaviors, like helping and
deceiving each other. The user is a project manager attempting to send off an email
project proposal by 5 pm but the email server is broken. The other characters
include Mary, the user's boss; Gus, the technician; and Sarah, another manager.

This scenario was inspired by Cesta and Miceli's work [Cesta93]. They use a
similar scenario to demonstrate their approach to agents modeling other agents.
Since my approach to this problem is quite different, this system provides a point
of comparison to other work in this area. Figure 1-6 shows part of a trace from
this simulation.
FIGURE 1-6 An Interaction with Office Politics
You are in Gus's office.
The desk, the chair and Gus are in Gus's office.
Gus works.

[4:35] PLAYER> Gus: Can you help me fix the email server?
Gus says "Sorry, bud. I'd like to help, but I'm swamped today.".

[4:36] PLAYER> go north
You are in the eastern hallway.
To the west, you see the western hallway.

[4:37] PLAYER> go west
You are in the western hallway.
To the north, you see Mary's office.

[4:38] PLAYER> go north
You are in Mary's office.
The desk, the chair and Mary are in Mary's office.
Mary works.

[4:39] PLAYER> Mary: I have to get a project proposal out and the server is broken.
Mary says "I don't know how to fix the server. Why don't you ask Gus?".

[4:40] PLAYER> Mary: I did, but Gus is too busy to help.
Mary says "Alright, I'll talk to Gus, but I really don't have time for this.".
The Playground was also designed to test the social side of the agents. In this
simulation, the user is a school child trading baseball cards during recess. The
other characters are Sluggo, the bully, and Melvin, the nerd. As will be argued in
Chapter 10, this simulation shows that I can create believable characters that
engage in reasonably complex social behaviors like negotiation and initiating
relationships with others. A partial trace from this simulation can be found in Figure
1-7.
FIGURE 1-7 An Interaction with The Playground
PLAYER> get in the sandbox
Melvin is now smiling.
Melvin says "Greetings, earthling.".

PLAYER> Melvin: Wanna trade?
Melvin says "The Klingon Emperor wants to know if you would be willing to part with Babe Ruth for Reggie Jackson?".

PLAYER> Melvin: No thanks.
Melvin says "Captain Picard would make the trade.".

PLAYER> Melvin: Sorry, no deal.
Melvin says "Are there any cards we (the Klingon high command and I) can offer you in return for Babe Ruth?".

PLAYER> Melvin: Well, do you have anything that would help me get Mays from Sluggo?
Melvin says "So the Vulcan ambassador wishes to get Willie Mays from Darth Sluggo. He really likes Mickey Mantle you know and I'd trade you.".

PLAYER> Melvin: Okay, how about Mantle for Ruth?
Melvin says "The Klingon high command accepts Vulcan's generous offer.".

[Time passes. The trade is finalized and the user approaches Sluggo.]

PLAYER> Sluggo: Do you want to trade with me?
Sluggo says "Make an offer, butthead.".
1.7 Summary
Here are some of the important issues that have come up in this chapter.
I introduced the idea of interactive drama, where a human user gets to play the
part of a character in an interactive story-based simulation. I also described the
Oz system and its model of interactive drama.
I introduced the problem of enabling artists to build interactive characters for
interactive drama systems. These interactive characters are called believable
agents. Because this is inherently an artistic problem, I described some of the
lessons we can learn from the arts about how to make good characters: they
don't need to be intelligent; they don't need to be realistic; they should have
distinctive personalities. All of these affect the kinds of approaches that are
appropriate for solving the problem.
I described the Oz approach to building agents with broad sets of shallow but
tightly integrated capabilities and argued that this is a reasonable approach to
building believable agents. I also described the Tok agent architecture and the
Hap language that is used to write most of the components of the Tok
architecture.
I argued that emotions and social behaviors are distinct but closely related
parts of the agent architecture. By studying both I am able to make both
systems richer and more interesting.
I described three simulated systems and seven believable agents that I have
built using the techniques that will be described in the thesis. These three
systems, Robbery World, Office Politics, and The Playground, were used to
motivate and test much of the research described in the thesis.
Part I:
Believable Emotional Agents
CHAPTER 2 Believable Emotional Agents
2.1 Introduction to the Problem
Characters in non-interactive media, like novels and movies, are often emotional.
In fact, artists tell us that emotions are critical to the believability of their
characters. Frank Thomas and Ollie Johnston, two of Disney's original animators,
wrote a book called The Illusion of Life about creating believable animated
characters; here are some of the things they have to say [Thomas81]:
"From the very beginning, it was obvious that these feelings of the characters
would be the heart and soul of Disney pictures." (p. 473)

"From the earliest days, it has been the portrayal of emotions that has given the
Disney characters the illusion of life." (p. 505)
The overriding goal for this part of the thesis is to enable artists to create
interactive versions of believable emotional characters like the ones that Thomas and
Johnston talk about. This is what I mean by believable emotional agents.
2.2 Foundation
My primary goal is to enable artists to create believable emotional agents. In
order to accomplish this goal, I drew on previous work from a range of sources,
including art, psychology, and AI. The arts helped me understand the problem
better, and psychology and AI provided some insights into how I might solve the
problem.
2.2.1 Art & Entertainment
It is artists who best know how to create believable characters and how to imbue
them with emotions, so it is fitting to turn to the arts for guidance in building
interactive believable characters.
The first contribution of artists is that they identify emotion as an important
problem for building believable agents. For instance, the excerpts from Thomas
and Johnston at the beginning of the chapter indicate how important they feel
emotions are for creating quality characters.
Artists also provide ideas about how to create effective emotions for characters.
These ideas are not formal, so they cannot be directly implemented, but they
have helped me make a number of important design decisions that I will discuss
in more detail later.
One important idea about how to create effective emotional agents is that the
emotions should be specific to the character. In other words, each character
needs to be unique and its emotions need to fit its particular personality. Again,
here are some excerpts from The Illusion of Life [Thomas81]:
"These characters showed hatred and scorn in their own way, but in a convincing
manner. They were equally entertaining, but they were in no way interchangeable,
which points up the importance of the storyman's knowing his
characters." (p. 483)

"[I]t is the animator who must think deeply into the personality of the cartoon
actors. Each must be handled differently, because each will express his
emotions in his own way." (p. 487)
These quotations refer to the expression of emotion, but it is also important for
the characters to have individual emotional reactions to situations as well. For
instance, in Disney's Snow White, each of the seven dwarves might feel very
differently about a single event because of his distinctive personality. And, as the
quotations above state, even when characters have similar responses, they need
to express those reactions individually.
Another important idea from the arts is that the characters' emotions need to be
expressed broadly. That is, emotions must affect everything about the character:
the way it moves, the way it talks, the expression on its face. An underlying
assumption here is that the purpose of a good character is to clearly communicate
its thoughts, feelings, and personality to the audience (or, in the case of interactive
characters, the user). By expressing emotion in only the character's face, for
example, artists find it harder to communicate than if the whole character is used
to express the emotion. Thomas and Johnston have this to say on the subject
[Thomas81]:
-
8/4/2019 ReillyWSN Believable Social and Emotional Agents PhD CMU CS 96 138
39/300
Foundation
BELIEVABLE SOCIAL AND EMOTIONAL AGENTS 27
If a scene calls for showing tense emotions such as anguish, scorn, bitterness,
or envy with only facial expression, the animator will be quite limited. But if
the story is built so that the character reveals these feelings in what he does and
how he does it... the scenes can be gripping and entertaining. (p. 482)
The expression must be captured throughout the whole body as well as in the
face. (p. 443) [Emphasis in the original.]
One producer at Disneys insisted that if a character said he felt a certain way,
that was all that was needed.... But it does not work like that. It is not enough
simply to proclaim that a character is mad or worried or impatient. There must
be business to support the statement and a situation in which he can demon-
strate these emotions if the audience is to be convinced that it is so. (p. 387)
Finally, the arts remind us that the goal is believable emotions, not realistic emo-
tions. Artists will often want to create characters that are exaggerated or larger
than life, which is at odds with achieving realism. Also, animated characters
can be believable, even though they are clearly unrealistic. Some characters may
seem quite realistic; others will be wildly unrealistic, but they can all be believ-
able in the artistic sense of the wordI want to enable artists to create whichev-
er they want. My experience has been that this is the hardest of the artistic
principles to graspeven Walt Disney had trouble expressing it. Again, from
[Thomas81]:
There was some confusion among the animators when Walt first asked for
more realism then criticized the result because it was not exaggerated enough...
When Walt asked for realism, he wanted a caricature of realism. One artist an-
alyzed it correctly when he said, I dont think he meant realism. I think he
meant something that was more convincing, that made a bigger contact with
people, and he just said realism because real things do... (p. 66)
Thomas and Johnston, however, are not unclear on the issue [Thomas81]:
It should be believable, but not realistic.... Tell your story through the broad
cartoon characters rather than the straight ones. There is no way to animate
strong-enough attitudes, feelings, or expressions on realistic characters to get
the communication you should have. The more real, the less latitude for clear
communication. (p. 375, emphasis added)
Although I have focused primarily on the animation work reported in Thomas
and Johnston, these ideas are not particular to them or to animation. Some of
these ideas can be traced as far back as Aristotles Poetics [Aristotle87b] and art-ists in other media (such as screenplays [Horton94], novels [Gardner91], and
even comic books [McCloud91]) make similar claims.
To summarize, the four important lessons to draw from the arts are:
Emotions are important for creating believable characters.
-
8/4/2019 ReillyWSN Believable Social and Emotional Agents PhD CMU CS 96 138
40/300
Believable Emotional Agents
BELIEVABLE SOCIAL AND EMOTIONAL AGENTS 28
Emotions need to be specific to the character in question.
Emotions need to be expressed broadly.
Emotions must be believable but may not always be realistic.
2.2.2 Psychology & Cognitive AI
The arts provide insights into the problem of creating believable emotional
agents, but they do not provide computational ideas for how to build autonomous
agents with emotions. Psychology and cognitive AI provide some ideas about
how to approach this problem.
Once again, as the goal of this work is based in the arts, not in psychology, I am
not particularly interested in using cognitively plausible models of emotion. In
fact, as indicated in the previous section, artists may well want characters withemotions that are cognitively unrealistic, but nonetheless appropriate for the
characters.
This means that the emotion theories I draw on do not have to be correct to suit
my needs. They only need to be able to help artists build believable emotional
characters. I chose as a basis for my work the emotion theories of Ortony, Clore
and Collins (OCC) [Ortony88] and Gilboa and Ortony [Elliott92]1. The first de-
scribes when people are emotional and the second describes how people express
emotions.
One reason for choosing these models is that they were designed to be imple-
mented computationally. Other researchers (e.g., [Elliott92] and [Warner91])
have also implemented versions of these models.
Another reason for adopting these models is that they are reasonably simple to
understand. Because I eventually want artists to use these models and it is likely
that these artists will not have much formal psychology training, I wanted to
keep the models as simple as possible. Decisions about what is simple are
clearly subjective and it is possible that my tools are harder to use than if I had
chosen some other models. I will describe my decisions and how I made them in
this thesis but I leave it as future work to determine if other models are more use-
ful and easily understood by artists.
1. Gilboa and Ortony never published this theory, though it is described in [Elliott92].
-
8/4/2019 ReillyWSN Believable Social and Emotional Agents PhD CMU CS 96 138
41/300
Foundation
BELIEVABLE SOCIAL AND EMOTIONAL AGENTS 29
I will, however, briefly describe two of the models that I did not choose as a basis
for my work: the basic emotion model of Oatley [Oatley92] and the emergent
emotion model of Sloman [Sloman86].
Oatley hypothesizes five basic emotions that are the foundation of all emotional
experience: joy, distress, fear, anger, and disgust. All other emotions are hypoth-
esized to be related to these emotions in various (and, it appears to me, unspeci-
fied) ways. Although it is possible that this model has some psychological basis
in reality, because the model doesnt make explicit how specific, non-basic emo-
tions relate to the basic five, I felt it would make it difficult for the artist who
wanted to create characters with non-basic emotions like hope or jealousy.
Sloman (like Simon [Simon67]) hypothesizes that emotions are emergent prop-
erties of complex, resource-limited, motivation-processing systems. In this mod-
el, emotions are states of the overall system and there is no separate emotioncomponent. In other words, mechanisms of the mind designed to deal with the
difficulties of a complex environment can be in perturbance states where the
agents high-level cognitive processes are partially out of control. Sloman calls
these perturbance states emotional. (Note that he uses the word emotional,
but he does not use the word emotion because of the confusion surrounding
the definition of that word.) Beaudoin [Beaudoin94] has also explored this model
in depth and built a simulated environment to test and demonstrate some of these
ideas. Again, this may be sound psychology, but, as I will discuss, I have chosen
to create an explicit emotion system to give artists more direct control over the
emotions of their characters.
It may be that as artists want more and more complex characters and as psycho-
logical models become more and more accurate and powerful, a new set of tools
will need to be built that rely on more cognitively plausible models. The impor-
tant thing to remember is that the goal is an artistic one and using cognitively
plausible models is only appropriate if it helps achieve this goal.
2.2.3 Story-Based AI
Other AI researchers who have influenced my work come from the area of story-
based AI. I have found useful ideas and inspiration in the work of Meehan
[Meehan76], Carbonell [Carbonell79], Dyer [Dyer83], and Lebowitz
[Lebowitz84,Lebowitz85]. Each has provided some insight into the problem of
building believable emotional characters.
Dyers BORIS system understands stories about emotional episodes, such as di-
vorces. BORIS does not generate stories, nor is it interactive, so on the surface it
may not appear especially similar. However, in order to store the emotional con-
-
8/4/2019 ReillyWSN Believable Social and Emotional Agents PhD CMU CS 96 138
42/300
Believable Emotional Agents
BELIEVABLE SOCIAL AND EMOTIONAL AGENTS 30
tent of stories, Dyer created an AFFECT structure that I modified and used as a
means of storing the emotional experiences of characters.
Lebowitz and Meehan developed story-generation systems that include charac-ters. Lebowitzs UNIVERSE creates soap operas and Meehans TALE-SPIN cre-
ates fables. Both systems adopt ways of representing relationships and feelings
about other characters that are similar to the attitude system that I have adopted.
The attitudes that characters have towards each other in UNIVERSE are de-
scribed along four dimensions (like-dislike, attractedness, dominant-submissive,
and intimate-distant). Meehans characters use the following scales: affection,
competition, deception, trust, domination, familiarity, indebtedness.
These systems relate to my work in two important ways, as I shall describe: they
support my decision to study the emotional and social aspects of characters to-
gether and they provide good examples of the importance of providing the artistwith as many choices as possible.
The first idea is fundamental to the dual nature of this thesis; I feel that the emo-
tional and social aspects of characters are closely enough related that to do ei-
ther one well, it is necessary to do both. This is the argument I presented in
section 1.5. The fact that both Lebowitz and Meehan were able to use a single
system to handle emotions and social relationships of agents gives additional
support to the interconnection of these two facets of agents.
The second idea, thatI want to supply choices to the artist, is based on an obser-
vation about the choices that Lebowitz and Meehan made. Both devised a systemto model the relationships and emotions that characters can have about each oth-
er. The systems are quite different from each other and Lebowitzs model is dif-
ferent from the psychological model that he drew on [Wish76]. Furthermore, the
attitude system that the OCC model [Ortony88] proposes has only like and dis-
like attitudes. So, here are four different models, all of which are useful, though
limited. And if I were to choose the superset of all of these models, the chances
are good that I would still miss attitudes that artists would want to use. I didnt
want to build in assumptions (like these other systems did) about things like
what attitudes characters can have about each other for fear of stifling the cre-
ativity of artists. The contrast of the different attitude systems of Lebowitz, Mee-
han, Wish, and OCC with my more general solution to this problem, provides anexample my overall approach of providing artists with freedom and flexibility
whenever possible.
Continuing with the look at story-based AI, Carbonell focused more on personal-
ity than emotions in his work. His analysis of different personality types suggests
how various traits can be expressed through plans, goals, and reactions to failure.
-
8/4/2019 ReillyWSN Believable Social and Emotional Agents PhD CMU CS 96 138
43/300
Contribution: Tools for Building Emotional Agents
BELIEVABLE SOCIAL AND EMOTIONAL AGENTS 31
(The goal of this analysis was to understand stories better.) This analysis, howev-
er, proves useful for understanding many of the ways emotions can be expressed.
For example, one of the reaction-to-failure traits is depressed. A depressed
agent, according to this theory, will often respond to failures by being more like-ly to abandon the plan and goal. In Chapter 5, I will discuss a number of ways to
express emotions, one of which is making the current plan and goal more likely
to be abandoned.
2.3 Contribution: Tools for Building Emotional Agents
In this section I will introduce a set of tools, collectively called Em, that support
artists in the creation of believable emotional agents. In section 2.4, I will discuss
how working within a broad agent architecture has helped me create these tools.
Using the foundational work described in the previous section, I created a num-
ber of tools that support the creation of believable emotional agents: a frame-
work (or architecture) for building emotional agents, a specific system built
within this architecture that provides reasonable default emotional processing,
and discussions about how to use the first two tools to create specific believable
emotional agents.
Before I describe these tools, it is important to recall that the goal is not to create
cognitively plausible emotional agents. The architecture allows many unrealistic
(but possibly interesting) agents to be built and the default emotional processing,
though informed by the psychology literature, has been tailored to meet a specif-
ic artistic end.
2.3.1 The Em Emotion Architecture
The first tool that I provide artists is an emotion architecture1 that sits within a
larger agent architecture. The emotion architecture determines the boundaries of
what is and is not possible for the agent builder to create in terms of emotional
agents. For example, the architecture determines what inputs are available to the
agent builder for determining which emotions the agent will have. If the agent
builder didnt have access to the agents goals, it would be impossible to create
emotions based on those goals.
1. The definition ofemotion architecture and many of the other terms I will use can be found in Figures 2-2 and 2-3. (Additional terms will be introduced in future chapters.) I have tried to avoid using the genericterm emotion because of the confusion it can cause. Read and Sloman [Read93] have previously discussedsome of the terminological perils associated with working in the area of emotion research. To reiterate thenote in the two figures, much of the terminology I use is specific to my work. The underlying assumptionsbehind this terminology will be made clear in the next few chapters.
-
8/4/2019 ReillyWSN Believable Social and Emotional Agents PhD CMU CS 96 138
44/300
Believable Emotional Agents
BELIEVABLE SOCIAL AND EMOTIONAL AGENTS 32
Figure 2-1 provides a look at the bare-bones version of the Em emotion architec-
ture. I describe the details of the Em architecture in Chapters 3 through 5, but I
begin here with a brief overview of the whole architecture.
The first thing to notice in Figure 2-1 are the Inputs from Tok. These inputs are
used to decide when an agent should react emotionally. The Em architecture pro-
vides a wide range of inputs to the artist-defined rules that determine what emo-
tions the agent will display.
The Emotion Generators box represents this set of rules (called emotion gener-
ators) that take the set of inputs and produce a set ofEmotion Structures. These
rules are written in the Hap language. An example emotion generation rule is the
following: when an agent has a goal failure and the goal has importance X, gen-
erate an emotion structure of type distress and with intensity X. Emotion struc-
tures have a type (e.g., fear), an intensity (e.g., 7 out of 10), possibly a direction(e.g., Sluggo), and a cause (e.g., Sluggo is threatening to beat me up). Details of
the inputs to the emotion architecture and the emotion generators will be provid-
ed in Chapter 3.
A set ofEmotion Storage Functions takes the emotion structures as they are
created and puts them into an Emotion Type Hierarchy. Emotion structures are
placed in this hierarchy based on what kinds of effects they will have, with high-
er-level nodes representing more general effects and lower-level nodes repre-
senting more specific effects. For example, the hierarchy might have a distress
type that represents general forms of distress expression, like frowning, crying,
and moving slowly. Below that type might be subtypes, such as grief, homesick-ness, and lovesickness, which inherit the general effects of their common parent,
but that also have more specific means of expression as well, such as thinking
about home when homesick.
Each type in the hierarchy (e.g., distress) has an intensity associated with it that
is a function on the intensities of the emotion structures of that type. The way
that the intensities of the emotion structures are combined is determined by a set
ofEmotion Combination Functions.
-
8/4/2019 ReillyWSN Believable Social and Emotional Agents PhD CMU CS 96 138
45/300
Contribution: Tools for Building Emotional Agents
BELIEVABLE SOCIAL AND EMOTIONAL AGENTS 33
FIGURE 2-1 The Em Architecture
Emotion Generators
Emotion Structures
Behavioral Feature
Map
Behavioral Features
Inputs from Tok(Including Em)
Outputs to Tok(Including Em)
Emotion Type
Emotion Storage
Functions
Combinationand
DecayFunctions Hierarchy
-
8/4/2019 ReillyWSN Believable Social and Emotional Agents PhD CMU CS 96 138
46/300
Believable Emotional Agents
BELIEVABLE SOCIAL AND EMOTIONAL AGENTS 34
The intensity of the emotion structures will decay over time at a rate specified by
the artist in the Emotion Decay Functions. Each structure can have its own de-
cay function if desired (e.g., anger from being insulted decays slower than other
emotion structures), or decay functions can be defined at the emotion-type level(e.g., all anger emotion structures decay slowly) or over all emotions (i.e., all
emotions decay at the same rate). Details about storing, combining, and decaying
emotions will be provided in Chapter 4.
The emotion structures are mapped into Behavioral Features via a Behavioral
Feature Map. This arbitrary mapping is written in Hap. It is the behavioral fea-
tures, and not the emotion structures, that directly affect behavior.
The final component of the Em architecture are the Outputs to Tok. The behav-
ioral features are able to affect a number of different aspects of the agents pro-
cessing. More details about the behavioral features and the effects on the agentwill be provided in Chapter 5.
2.3.2 The Default Em Emotion System
The Em architecture provides the structure that an artist will work within when
creating emotional characters, but none of the content. And because the
architecture is so flexible for artistic reasons, it can be hard to know how to
begin. For instance, artists have a large amount of flexibility in determining how
to map inputs to emotion structures, but coming up with a good mapping is still a
hard problem. Similarly, determining