Understanding chimpanzee facial expression:
insights into the evolution of communication
Lisa A. Parr¹ and Bridget M. Waller²
¹Division of Psychiatry and Behavioral Sciences & Yerkes National Primate Research Center, Emory University, Atlanta, GA, USA; ²Department of Psychology, University of Portsmouth, Portsmouth, UK
To understand the evolution of emotional communication, comparative research on facial expression similarities between
humans and related species is essential. Chimpanzees display a complex, flexible facial expression repertoire with many physical
and functional similarities to humans. This paper reviews what is known about these facial expression repertoires, discusses the
importance of social organization in understanding the meaning of different expressions, and introduces a new coding system,
the ChimpFACS, and describes how it can be used to determine homologies between human and chimpanzee facial expressions.
Finally, it reviews previous studies on the categorization of facial expressions by chimpanzees using computerized tasks, and
discusses the importance of configural processing for this skill in both humans and chimpanzees. Future directions for
understanding the evolution of emotional communication will include studies on the social function of facial expressions in
ongoing social interactions, the development of facial expression communication and more studies that examine the perception
of these important social signals.
Keywords: facial communication; primates; face processing; comparative research
NATURAL ETHOLOGY OF FACIAL EXPRESSION: FORM
AND FUNCTION
Many animal species communicate using a variety of highly
conspicuous signals, including acoustic, tactile, olfactory and
visual displays that have been tuned by natural selection to
impact the listener in a reliable way (Smith, 1977; Dawkins
and Krebs, 1978). Among primates, the visual and auditory
domains have become the two most prominently involved in
social communication. There is, for example, a general
assumption that facial expressions convey a variety of
information about an individual’s motivation, intentions
and emotions (van Hooff, 1967; Ekman, 1997; Parr et al.,
2002). As such, facial expressions are critically important for
coordinating social interaction, facilitating group cohesion
and maintaining individual social relationships (Parr et al.,
2002). Despite the importance of facial expressions in the
evolution of complex societies, there is little work comparing
either the form or function of facial expression across
distinct phylogenetic groups. Moreover, apart from a
handful of excellent ethograms that describe the commu-
nication repertoires for a variety of species, including
chimpanzees, bonobos, rhesus monkeys, capuchin monkeys
and canids (Hinde and Rowell, 1962; van Hooff, 1962, 1967,
1973; Andrew, 1963; Goodall, 1968; Fox, 1969; Bolwig, 1978;
Weigel, 1979; de Waal, 1988; Preuschoft and van Hooff,
1997; Redican, 1982; Parr et al., 2005), there has been little
attempt to standardize the description of their facial and
vocal displays in a manner that facilitates comparative and
evolutionary studies.
Fridlund (1994) has proposed that facial expressions are
best understood as communicative signals and as such,
researchers should focus on their functional consequences
during social interaction, or how they impact the listener.
Moreover, the only way to fully understand why facial
expressions have evolved to convey a specific meaning is to
compare similar facial expressions between evolutionarily
related species, examining any factors that may have
influenced their social function, including ecological pres-
sures and social factors like dominance style and social
organization (Preuschoft and van Hooff, 1997). Some facial
expressions appear to be well represented across diverse
taxonomic groups, making them good models for under-
standing social and emotional function, while others appear
to be species-specific. The bared-teeth display, also referred
to as the fear grin, or grimace, is one of the most
conspicuous and well-studied facial expressions in ethology
and has been reported in a variety of mammalian species
from canids to primates. Research has shown, however, that
the communicative function of this expression can differ
quite broadly depending on the species, their type of social
organization and social context. In wolves, for example,
retraction of the lips horizontally over the teeth results in a
‘submissive grin’ which is used by cubs and subordinates
when actively greeting adult conspecifics, or humans
(Fox, 1969). Antithetical to this expression is a vertical lip
Received 18 August 2006; Accepted 18 September 2006
Support for this article was provided by RR-00165 from the NIH/NCRR to the Yerkes National Primate
Research Center, R01-MH068791 to L.A.P. Special thanks to Matt Heintz for assistance with Poser 6.0, Paul
Ekman for the use of photographic stimuli, and Ralph Adolphs, David Skuse and the Cold Springs Harbor
Institute for organizing the special conference on the Biology of Social Cognition where these ideas were
disseminated.
Correspondence should be addressed to Lisa A. Parr, Division of Psychiatry and Behavioral Sciences &
Yerkes National Primate Research Center, Emory University, Atlanta, GA, USA. E-mail: [email protected].
doi:10.1093/scan/nsl031 SCAN (2006) 1, 221–228
© The Author (2006). Published by Oxford University Press. For Permissions, please email: [email protected]
retraction which is given by dominant animals during
aggressive interactions, very similar facial movements
but with vastly different social functions (Darwin, 1872;
Fox, 1969).
Among primates, the bared-teeth display also has
different meanings depending on the species and their type
of social organization. Among macaque species that have
despotic social systems characterized by strict, linear
dominance hierarchies, e.g. rhesus monkeys, the bared-teeth
display appears to be a signal of submission, or rank
recognition in that it is only given by subordinates to higher
ranking individuals (van Hooff, 1976; de Waal and Luttrell,
1985). This expression has been referred to as a formal signal
of dominance in the rhesus monkey because it is highly
ritualized in appearance and has long-term predictability in
determining dominance relationships despite short-term
variation in social contexts (de Waal and Luttrell, 1985).
In this study, bared-teeth displays performed by subordinate
individuals occurred most often in response to the approach
of a dominant monkey, and the most frequent response was
for the subordinate to withdraw from any social interaction
(de Waal and Luttrell, 1985). However, the meaning of
the bared-teeth display is quite different when used by
species with more egalitarian social systems, including some
macaques, mandrills, Gelada baboons and chimpanzees
(van Hooff, 1967; Preuschoft and van Hooff, 1997).
In these species, the bared-teeth display is more appeasing
and functions to increase social attraction and affiliation.
It communicates benign intent in that the signaler
wishes no harm, and that there is no risk of aggression
(van Hooff, 1967; van Hooff, 1976; Waller and Dunbar,
2005). It can also occur during affiliative contexts, such as
grooming, sexual solicitation and reconciliations, and thus
functions to increase affiliative tendencies and reduce
social distance between individuals (van Hooff, 1973;
Preuschoft and van Hooff, 1997; Parr et al., 2005; Waller
and Dunbar, 2005).
Numerous authors have gone on to propose that the
bared-teeth display is homologous with the human smile,
meaning that they are the result of common evolutionary
descent (van Hooff, 1972; Preuschoft and van Hooff, 1997;
Waller and Dunbar, 2005). This conclusion is based in part
on the physical similarity in the appearance of the bared-
teeth display and the human smile, which are both
characterized by retraction of the lip corners exposing, in
most cases, the upper and lower teeth. However, the
homology is also based on similarity in the social function
of these expressions, indicating appeasement, reassurance,
increasing social bonding, and thus its important role in
facilitating social cohesion among primates (Preuschoft and
van Hooff, 1997). These examples suggest that while similar
appearing expressions can often have different meanings, or
serve different social functions, depending on the species and
their type of social organization, they share a common
evolutionary root. Therefore, in order to understand the
evolution of human emotional expressions, like the smile,
it is extremely informative to take a comparative approach
and evaluate whether the form and function of these
expressions are uniquely human, present in many related
species, or perhaps only shared by very closely related
species, like Hominoids (Fridlund, 1994). The picture is less
clear for other expressions, and only a few other direct
comparisons have been made, e.g. the homology between human
laughter and the nonhuman primate ‘play face’ (Preuschoft
and van Hooff, 1997; van Hooff, 1972). This is undoubtedly
due to the fact that researchers, to date, have lacked a
common language, or standardized system that could be
used to identify potential homologues. Ideally, any assess-
ment of homology should be based on something other than
physical appearance, so a consideration of the underlying
musculature is important (Waller et al., 2006). Among
primates, it is highly likely that many expressions may share
a common ancestry, and thus a comparative treatment is
warranted, even if the expressions look different in terms of
their characteristic features.
DEVELOPMENT OF COMPARABLE CODING SYSTEMS
FOR PRIMATE FACIAL EXPRESSIONS
Recently, researchers have developed an objective, standar-
dized method for measuring facial movement in the
chimpanzee that is directly comparable to humans, thus
facilitating research on facial expression homologues (Vick
et al., in press; www.chimpfacs.com). This system, referred
to as ChimpFACS, is based on the well-known human Facial
Action Coding System, or FACS, developed by Ekman and
colleagues to provide an objective tool for studying the
biological basis for human expressions and emotion (Ekman
and Friesen, 1978). Both of these systems are unique in that
they identify the most minimal units of facial movement
according to the function of the underlying musculature.
Therefore, both systems enable the objective description of
facial appearance changes in the human face and the
chimpanzee face that are associated with movements of the
underlying facial musculature. Each movement is described
using a numeric code, referred to as an Action Unit (AU),
and researchers are trained to use these codes reliably
through a standardized testing process, making all users of
the system statistically reliable with one another
(http://face-and-emotion.com). There are several advantages to
using such a standardized system for comparative studies.
First, because the facial musculature of chimpanzees and
humans is highly comparable (Burrows et al., 2006; Waller et
al., 2006), the two systems provide a basis for understanding
homologous facial expression structure in the two species.
Second, by orienting attention towards individual facial
movements, researchers are encouraged to view the face
objectively and not be biased by the humans’ natural
tendency to focus on overall expression configuration
(Wallbott and Ricci-Bitti, 1993; Calder et al., 2000). Thus,
the face can be described, or coded, objectively using
minimal units of measurement without interference from its
holistic emotional quality. Finally, because the systems are
both standardized using a graded testing procedure,
researchers from different groups are now able to compare
their results directly using a common nomenclature. The
value of the system is obvious in the examples given above;
when attempting to determine whether expressions are
homologous across chimpanzees and humans, like the
bared-teeth display and the smile, researchers can begin by
using subjective ratings of their similarity, but then utilize
the more objective, standardized assessment of their
anatomical similarity by comparing individual action units
in both species.
After the creation of the ChimpFACS, researchers were
interested in testing its accuracy to determine whether this
bottom-up approach (action unit to expression configura-
tion) could actually validate the existing categories of
chimpanzee facial expressions made previously by experts
in chimpanzee communicative behavior. This was done
using a discriminant function analysis (DFA), which enabled
a statistical comparison of whether the a priori expression
categories could be predicted by common patterns of
underlying facial movements, or patterns of AUs, coded
from each expression using ChimpFACS. Over 250 facial
expressions were categorized by Parr using published
ethograms of chimpanzee behavior (Parr et al., 2005), and
then coded using ChimpFACS by Waller, a certified
ChimpFACS expert (Vick et al., in press). Nine expression
categories and 15 action units accounted for the 250+
examples and this included an ambiguous category that was
used for expressions that could not easily be classified. The
results of the DFA showed a high percentage of agreement
for almost all of the expression categories (Parr et al., in
press). The only notable exception was the pout, which was
most often identified as a pant-hoot due to a similarity in
one AU movement (AU22-lip funneler) which was present
in both of these expressions. Therefore, objective coding of
naturally occurring chimpanzee facial expressions according
to minimal facial movements using ChimpFACS produced
functionally meaningful expression categories. Moreover, the
analysis enabled the identification of unique combinations of
muscle movements (AUs) for all expression categories. Thus,
the DFA provided a clear description of prototypical facial
expression configurations based on standardized muscle
action (Parr et al., in press). These configurations were
so predictive that naïve researchers were able to
identify expression categories with over 80% accuracy
solely by reading the AU configuration (Parr et al., in
press). An illustration of each prototypical expression
configuration and the percentage of category agreement
between AU codes and a priori classifications can be seen
in Figure 1.
Fig. 1 An illustration of prototypical chimpanzee facial expressions. These are listed in pairs. The example on the left side of the pair shows the Poser animated expression, while the example on the right shows a naturalistic chimpanzee expression. Under the Poser expression is the prototypical AU configuration as identified by the Discriminant Functions Analysis, and under the naturalistic expression is the percentage agreement between AU configuration and a priori classification for that category.
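The bottom-up logic of the validation study, predicting an expression category from its pattern of action units, can be sketched in a few lines of code. The AU sets and category assignments below are illustrative toy data, not the study's coded corpus, and a simple nearest-centroid classifier stands in for the discriminant function analysis actually used; this is a sketch of the idea, not the published analysis.

```python
# Sketch: predicting expression category from binary AU-presence vectors,
# in the spirit of the DFA validation described above.
# AU codes and category assignments here are illustrative, not the study's data.

AUS = [6, 10, 12, 16, 22, 25, 26]  # a subset of coded action units

# Toy "coded expressions": (category label, set of AUs observed).
coded = [
    ("bared-teeth", {10, 12, 16, 25}),
    ("bared-teeth", {10, 12, 25}),
    ("play face",   {12, 25, 26}),
    ("play face",   {6, 12, 25, 26}),
    ("pant-hoot",   {22, 25, 26}),
    ("pout",        {22, 25}),
]

def to_vector(aus):
    """Binary presence vector over the AU inventory."""
    return [1.0 if au in aus else 0.0 for au in AUS]

# Per-category mean vector (centroid) of AU presence.
centroids = {}
for label in {c for c, _ in coded}:
    rows = [to_vector(a) for c, a in coded if c == label]
    centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]

def classify(aus):
    """Assign the category whose centroid is nearest (squared Euclidean)."""
    v = to_vector(aus)
    return min(centroids,
               key=lambda c: sum((x - y) ** 2 for x, y in zip(v, centroids[c])))

print(classify({10, 12, 16, 25}))  # a bared-teeth-like configuration
```

Note that a classifier like this also reproduces the kind of confusion the DFA found: categories that share a distinctive AU (the pout and pant-hoot both containing AU22) sit close together in this space.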
THE STANDARDIZATION OF CHIMPANZEE FACIAL
EXPRESSIONS USING CHIMPFACS
The identification of prototypical expression configurations
is extremely important for ongoing research on chimpanzee
social cognition as it provides a blueprint for standardizing
the images used in computerized tasks, such as expression
categorization studies (Parr et al., 1998). Using the
results from the DFA, researchers now have a list of
prototypical facial configurations that account for each
major expression category, and an objective tool in
ChimpFACS for identifying these. Even with these tools,
however, acquiring photographs of chimpanzee expressions
at their peak intensity is extremely difficult because
expressions are dynamic, often occur during highly charged
social contexts where there is fast movement, and
subjects are more often than not faced away from the
photographer, making it difficult to capture frontal pictures,
or to standardize these in terms of head orientation
and posture.
To overcome the difficulties in obtaining high quality
naturalistic photographs of chimpanzee expressions,
researchers have now turned to custom three-dimensional
(3D) animation software to create these configurations
manually. There are two types of expressions shown in
Figure 1, a naturalistic photograph and a cartoon-like
chimpanzee face. The cartoon-like face was created using
the 3D animation software Poser 6.0 (www.efrontiers.com),
which enables detailed, custom animation of facial move-
ments by allowing the user to impose an artificial muscle
structure onto the face and then animating this ‘muscle’ to
affect facial appearance in a highly controlled and natu-
ralistic manner. To illustrate, the Poser expressions in
the figure were made using anatomically accurate muscle
structure and realistic AU movements and are posed in the
prototypical facial configurations identified by the DFA
(Parr et al., in press). Because chimpanzee researchers cannot
position chimpanzee faces using verbal instruction, as is
the primary method for creating human expression
stimuli for use in emotion perception studies, the Poser
chimpanzee allows researchers control over head orientation,
gaze direction, expression configuration and intensity.
The application of these techniques will considerably
advance comparative cognition research as it eliminates
the aforementioned difficulties in creating controlled
stimulus libraries where each example expression is required
to be photographed at peak intensity, in full-frontal
posture from freely behaving chimpanzees. Moreover, in
combination with the ChimpFACS, researchers can use
these models to pose individual action units, realistic
combinations of action units, such as those seen
in Figure 1, and even artificial combinations of facial
movements to specifically examine which elements of
expressions are most meaningful for the chimpanzee’s
emotional perception.
COMPARING HOMOLOGOUS FACIAL MOVEMENTS IN
CHIMPANZEES AND HUMANS
With the development of a standardized system for
measuring facial movement in chimpanzees (ChimpFACS)
that is anatomically and organizationally comparable to
humans (FACS), the stage is set for a more comprehensive
investigation into the evolution of facial expressions and
facial emotion (Parr et al., in press; Vick et al., in press;
Waller et al., 2006). There are several ways to attempt such
an investigation and it will likely be quite some time before
there are enough data to make any definitive conclusions. At
this very preliminary stage, one can reasonably describe
similarities in the facial movements between prototypical
chimpanzee expressions and human emotional expressions
(Parr et al., in press). In their study, these authors most
closely matched the AU configuration of the chimpanzee
bared-teeth display to the human smile, chimpanzee screams
to human screams, the chimpanzee bulging-lip face to
human anger, chimpanzee laughter to human laughter, but
could not find a good human equivalent for the chimpanzee
pant-hoot. When humans have been asked to judge the
emotional quality of chimpanzee expressions using basic
emotion labels, the following comparisons have emerged:
chimpanzee bared-teeth display = happiness, chimpanzee
play face = happiness, chimpanzee pant-hoot = happiness,
chimpanzee scream = anger (Fernandez-Carriba et al.,
unpublished data). Thus, in these very preliminary compar-
isons, similarities emerged.
An alternative, and arguably a more intuitive, approach is
to first match human and chimpanzee expressions based on
the muscular components, and then make inferences about
the emotional quality of human faces that are structurally
homologous to chimpanzee expressions. Figure 2 shows such
a comparison where the human images are taken directly
from the FACS manual (Ekman et al., 2002) and show facial
movements that are structurally homologous to the proto-
typical chimpanzee expressions. The identical AUs shared by
the two expression examples are highlighted in bold italics.
In some cases there is an extra movement in either the
human or chimpanzee example. Not surprisingly, one of the
most striking comparisons here is between the chimpanzee
bared-teeth display and its human equivalent. As discussed
earlier, previous researchers have suggested a homology
between the chimpanzee bared-teeth display and the human
smile (van Hooff, 1972; Preuschoft and van Hooff, 1995;
Waller and Dunbar, 2005; Parr et al., in press). However, the
human bared-teeth expression in Figure 2 strongly resembles
a grimace, or a forced smile, but it does not give the
countenance of happiness. Contrast this to the human play
face, which gives a strong impression of happiness and
supports others who have suggested that this expression is
homologous to laughter in humans (van Hooff, 1972;
Preuschoft and van Hooff, 1995; Waller and Dunbar, 2005;
Parr et al., in press). One explanation may be that genuine
human smiles, those associated with emotional happiness
or enjoyment, include an AU6-cheek raiser, which is missing
in the human bared-teeth expression (Ekman and Friesen,
1982; Ekman et al., 1990; Frank et al., 1993). Interestingly,
when people have been asked to rate human faces that
contain either individual movements (single AUs) or
combinations of movements (multiple AUs), researchers
report that the configuration AU10+12+16+25 (the same
movements as in the prototypical chimpanzee bared-teeth
face) is most often described as showing fear (Wallbott and
Ricci-Bitti, 1993). While it might seem unlikely that the
addition of a single AU (the AU6) could change the entire
emotional interpretation of an expression, the data strongly
support this conclusion. Because humans process facial
expressions configurally, and do not selectively attend to
individual features, the configuration AU10+12+16+25
is interpreted differently than AU6+10+12+25 (Wallbott
and Ricci-Bitti, 1993). If the production of AU6 does indeed
correlate with genuine happiness, then it is arguably very
important to distinguish between these two expressions,
and thus configural processing has a clear and crucial
function. Interestingly, the human scream face in Figure 2
contains an AU6, and it does not appear to have a
strong negative emotional countenance. These findings
provide a strong argument for the need to more fully
understand how chimpanzees process their own facial
expressions, what the role of configural processing is in
expression categorization and whether chimpanzees are as
sensitive to individual movements as humans. Therefore, in
order to more fully understand the evolution of commu-
nication and the social function of facial expressions, a
comparative perspective on facial expression categorization
is needed.
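At its simplest, the kind of configuration comparison described above reduces to comparing sets of action units. The two human configurations in the sketch below are the ones contrasted in the text; using Jaccard similarity to quantify their overlap is an assumption of this sketch, not a measure the cited studies report.

```python
# Sketch: comparing facial-expression configurations as sets of action units.
# The two human configurations below are those contrasted in the text;
# the Jaccard similarity measure is this sketch's own choice.

bared_teeth_like = {10, 12, 16, 25}   # read as fear by human raters
genuine_smile_like = {6, 10, 12, 25}  # AU6 (cheek raiser) marks enjoyment

def jaccard(a, b):
    """Proportion of shared AUs out of all AUs appearing in either set."""
    return len(a & b) / len(a | b)

shared = bared_teeth_like & genuine_smile_like
only_in_smile = genuine_smile_like - bared_teeth_like

print(sorted(shared))         # AUs common to both configurations
print(sorted(only_in_smile))  # the single AU (6) that flips the interpretation
print(round(jaccard(bared_teeth_like, genuine_smile_like), 2))
```

The point the set arithmetic makes concrete is the one argued in the text: two configurations can be nearly identical element-by-element yet receive opposite emotional interpretations, which is only possible if perceivers process the configuration as a whole.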
PREVIOUS STUDIES ON FACIAL EXPRESSION
CATEGORIZATION BY CHIMPANZEES
Studies of emotional communication have typically taken
one of two approaches: either they focus on the perception
of the display (how it is processed, and which salient
features or configurations are necessary for accurate
interpretation), or they focus on the social function of the
expression (how it affects the behavior of a receiver within
a social environment). Ideally, it is the combination of
these two approaches that will provide the most complete
understanding of emotional communication in ongoing
social interactions. Therefore, two critical questions for
understanding the evolution of emotional communication
are: first, whether chimpanzees and humans use similar
perceptual cues to discriminate among facial expression
categories, and second, what is the functional outcome of
these expressions for ongoing social interactions. In addres-
sing the first approach, human studies have shown strong
configural preferences for both face identity recognition
and facial expression categorization (Calder et al., 2000;
Etcoff and Magee, 1992; Wallbott and Ricci-Bitti, 1993).
Configural cues refer to the relative size, shape and spatial
arrangement of features within the face (Maurer et al., 2002).
While numerous studies have demonstrated a strong
configural bias for face identity processing in the chimpanzee
(Parr et al., 2000; Parr et al., 2006; Parr and Heintz, in press),
the perceptual cues important for facial expression categor-
ization remain unclear (Parr et al., 1998).
Parr and colleagues have performed several studies to
address this first approach: how do chimpanzees discrimi-
nate facial expressions, and what are the relevant features
for this perceptual process (Parr et al., 1998;
Fig. 2 Prototypical chimpanzee facial expressions and homologous facial movements in a human (Ekman et al., 2002).
Parr and Maestripieri, 2003)? The goal of these studies was
first to determine whether chimpanzees could visually match
different examples of expression types and second, to try and
understand how these categorizations were achieved. Were
subjects, for example, extracting specific salient features, like
the presence of teeth, mouth position, etc., or were they
relying on the overall expression configuration, like humans?
These studies have been performed using a computerized
joystick-testing paradigm and matching-to-sample (MTS)
format. Subjects have had years of expertise with this testing
situation and eagerly participate in daily testing sessions
(Parr et al., 2000; Parr and de Waal, 1999). The basic format
for MTS is that subjects are first presented with a sample
stimulus, a conspecific facial expression, for example, and a
cursor on the computer screen over a black background.
They have been trained to control the movements of the cursor
on the monitor by manipulating the joystick. The sample
stimulus is the image to match, and
they first must orient towards it by contacting it with the
joystick-controlled cursor. After this, the sample stimulus
clears the screen and they are presented with two alternative
stimuli; one matches the sample on a predetermined stimulus
dimension, i.e. expression type (target image), while the
other does not match (foil).
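The trial structure just described can be summarized in a short sketch. The class and field names below are hypothetical, invented purely to illustrate the procedure; they do not come from the lab's actual testing software.

```python
# Sketch of one matching-to-sample (MTS) trial as described in the text.
# All class/field/file names are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class MTSTrial:
    sample: str   # image shown first (the image to match)
    target: str   # matches the sample on the relevant dimension
    foil: str     # does not match

    def is_correct(self, chosen: str) -> bool:
        """A trial is correct when the joystick-selected image is the target."""
        return chosen == self.target

# Example: expression matching across individuals, so identity differs
# between sample and target while expression type is the matching dimension.
trial = MTSTrial(sample="chimp_A_scream.jpg",
                 target="chimp_B_scream.jpg",
                 foil="chimp_C_neutral.jpg")

print(trial.is_correct("chimp_B_scream.jpg"))  # True
```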
In a previous expression matching task, Parr and
colleagues (1998) presented chimpanzees with a sample
photograph showing an unfamiliar conspecific making one
of six facial expressions: bared-teeth, pant-hoot, play
face, relaxed-lip face, scream and neutral. The target image
was another individual making the same facial expression, so
the matching pair of photographs was unique, and the foil
showed a third individual making a neutral expression.
Successful matching, therefore, required discrimination
based on expression type, not the identity of the individual
making the expression. On the first testing session, all six
chimpanzees performed significantly above chance (50%) on
the majority of the expression categories (bared-teeth = 67%,
play face = 68%, scream = 64%), demonstrating that not only
are expressions highly salient, but that
expression type can be used spontaneously as a means for
discrimination (Parr et al., 1998).
In a follow-up to this study, Parr and colleagues (1998)
investigated how expressions were discriminated, using
overall configuration or the extraction of specific features.
To do this, each expression was characterized according to
the presence or absence of specific salient features using
subjective criteria, such as mouth open, eyes open, teeth
visible. This was done for five expression types, bared-teeth,
pant-hoot, play face, relaxed-lip and scream. In the
experimental task, each dyadic combination of expressions
was then paired together, totaling 20 different combinations
for the five expression categories. For half of the expression
pairs (N = 10), the target (match) and foil (non-match)
expressions shared three or more features in common, while
the other 10 pairs were very distinctive, sharing two or fewer
features in common. Figure 3 shows examples of a trial
where the expression pair shares features in common
(Figure 3A) or has little feature overlap (Figure 3B). The
performance on expression dyads that were similar versus
distinctive was then compared. The hypothesis was that if
the chimpanzees were extracting specific salient features,
such as teeth visible, when performing expression discrimi-
nations, their performance should be better on distinct dyads
compared with similar dyads. Moreover, a negative correla-
tion would be expected overall between the number of
shared features in each of the 20 expression dyads and
performance matching the expressions in those trials, i.e. the
more feature overlap, the worse performance would be. The
data supported the first prediction: overall, subjects’
performance was significantly better discriminating expres-
sion dyads that were distinct compared to similar,
t(4) = 2.94, P < 0.05. There was, however, little support for
the second prediction. The overall correlation between
shared features and performance was only weakly negative,
r = −0.11, NS (Parr et al., 1998). In fact, the role of
distinctive, non-overlapping facial features in providing an
advantage for expression discrimination was only detected
for some expression categories and not others. Thus,
Fig. 3 An illustration of an expression matching trial where the expression pairs share similar features (A), or in which the expression pair contains distinctive features (B).
no conclusive evidence was found for or against configural
facial expression processing in this species.
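The feature-overlap analysis above amounts to counting shared features for each expression pair and correlating that count with matching performance. In the sketch below, the feature assignments and per-dyad performance values are invented toy numbers (the performance values are deliberately constructed as a decreasing linear function of overlap, so the correlation comes out exactly −1); only the overall recipe mirrors the study.

```python
# Sketch: relating feature overlap between expression pairs to matching
# performance, as in the follow-up study described above.
# Feature sets and performance values are illustrative, not the study's data.
from itertools import permutations
from math import sqrt

features = {  # subjectively coded salient features (toy assignments)
    "bared-teeth": {"mouth open", "teeth visible", "eyes open"},
    "play face":   {"mouth open", "teeth visible"},
    "scream":      {"mouth open", "teeth visible", "eyes open"},
    "pant-hoot":   {"mouth open"},
    "relaxed-lip": {"eyes open"},
}

# Every ordered (target, foil) pairing of the five categories: 5 * 4 = 20 dyads.
dyads = list(permutations(features, 2))
assert len(dyads) == 20

overlap = [len(features[a] & features[b]) for a, b in dyads]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy per-dyad accuracy, built so that more overlap means worse performance;
# with this construction the correlation is exactly -1 by design.
performance = [0.9 - 0.05 * o for o in overlap]
print(round(pearson(overlap, performance), 2))
```

The study's actual finding, a weak r = −0.11, corresponds to real performance values being nearly unrelated to overlap, which is what left the configural-vs-feature question open.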
One serious problem with these earlier studies was the
inability to objectively categorize each expression in terms of
its relevant features. With the creation of ChimpFACS and
the development of the standardized, prototypical Poser
expressions described earlier, the question of how chimpan-
zees process facial expressions, configurally or using feature-
based cues, can be re-addressed in a more controlled fashion
by manipulating action units individually and in combina-
tions and presenting these to subjects using similar
discrimination studies as described earlier. These studies
are currently underway at the Yerkes Primate Center and will
make an important contribution towards understanding the
evolution of emotional communication by addressing
similarities and differences in the perceptual discrimination
of facial expressions by chimpanzees compared to humans.
In conclusion, a great deal still remains to be learned
about the evolution of social signals, like facial expressions.
This article has reviewed and expanded on data suggesting
homologies between the primate bared-teeth display and the
human smile, and the primate play face and human laughter.
But to what extent are other facial expressions evolutionarily
similar across related species? Emphasis has also been placed
on the general lack of knowledge concerning the social
function of facial expressions in non-human primates. Only
a handful of studies have thus far been attempted, yet the
question of how these expressions function in a social
context is critical for a more complete understanding of their
evolutionary significance and continuity. Finally, this brief
review did not address issues of development or social
learning, i.e. how do primates learn the social function
and/or meaning of these expressions? Is there an innate
component, or is learning totally dependent on experience?
These developmental questions pertain to more than just
understanding the meaning of expressions, but also in
knowing how to produce expressions, as in appearance, so as
to convey appropriate information to conspecifics, and then
knowing when to use them in the appropriate social
contexts.
Tools such as the ChimpFACS provide an exciting new
methodology for understanding similarities and differences
in chimpanzee and human expressions by comparing
homologous facial movements. Thus, this new tool will
help facilitate comparisons of facial expressions according to
their structural similarities and enable researchers to explore
whether facial expressions other than the bared-teeth
display and the play face may have structural homologs in
humans. Moreover, combining studies on the structural
similarity of these expressions with tasks that investigate
their perceptual classification will provide insights into
which features are important for expression categorization,
and whether the cognitive processes that underlie the categorization
of these signals are similar to those underlying
human expression categorization. This latter question
may tie in with recent work on human neuroimaging that
suggests that neural systems important for face identity
recognition and processing emotional content from facial
expressions are, in fact, dissociable (Haxby et al., 2000;
Kesler-West et al., 2001; Phan et al., 2004). Finally, sorely
lacking are studies that examine the social function of facial
expressions both in chimpanzees and in humans.
Understanding their function is critical for making any
inferences about evolutionary continuity, their emotional
meaning and the co-evolution of signaler and receiver to
maximize efficient communication within social groups.
Conflict of Interest
None declared.
REFERENCES
Andrew, R.J. (1963). The origin and evolution of the calls and facial
expressions of the primates. Behaviour, 20, 1–109.
Bolwig, N. (1978). Communicative signals and social behaviour of some
African monkeys: a comparative study. Primates, 19, 61–99.
Burrows, A.M., Waller, B.M., Parr, L.A., Bonar, C.J. (2006). Muscles
of facial expression in the chimpanzee (Pan troglodytes): descriptive,
ecological and phylogenetic contexts. Journal of Anatomy, 208, 153–68.
Calder, A.J., Young, A.W., Keane, J., Dean, M. (2000). Configural
information in facial expression perception. Journal of Experimental
Psychology: Human Perception and Performance, 26, 527–51.
Darwin, C. (1872). The Expression of the Emotions in Man and Animals,
3rd edn. New York: Oxford University Press.
Dawkins, R., Krebs, J.R. (1978). Animal signals: information or manipula-
tion? In: Krebs J.R., Davies, N.B., editors. Behavioural Ecology:
An Evolutionary Approach. Oxford, UK: Blackwell Scientific
Publications, pp. 282–309.
de Waal, F.B.M. (1988). The communicative repertoire of captive bonobos
(Pan paniscus), compared to that of chimpanzees. Behaviour, 106,
183–251.
de Waal, F.B.M., Luttrell, L. (1985). The formal hierarchy of rhesus
monkeys: an investigation of the bared teeth display. American Journal
of Primatology, 9, 73–85.
Ekman, P. (1997). Should we call it expression or communication?
Innovations in Social Science Research, 10, 333–44.
Ekman, P., Davidson, R.J., Friesen, W.V. (1990). The Duchenne smile:
emotional expression and brain physiology, II. Journal of Personality and
Social Psychology, 58, 342–53.
Ekman, P., Friesen, W.V. (1978). Facial Action Coding System. California:
Consulting Psychology Press.
Ekman, P., Friesen, W. (1982). Felt, false and miserable smiles. Journal
of Nonverbal Behavior, 6, 238–52.
Ekman, P., Friesen, W.V., Hager, J.C. (2002). Facial Action Coding System.
Salt Lake City: Research Nexus.
Etcoff, N.L., Magee, J.J. (1992). Categorical perception of facial expressions.
Cognition, 44, 227–40.
Fox, M.W. (1969). A comparative study of the development of facial
expressions in canids; wolf, coyote, and foxes. Behaviour, 36, 4–73.
Frank, M.G., Ekman, P., Friesen, W.V. (1993). Behavioral markers and
recognizability of the smile of enjoyment. Journal of Personality and Social
Psychology, 64, 83–93.
Fridlund, A.J. (1994). Human facial expression – An evolutionary view.
London: Academic Press.
Goodall, J. (1968). A preliminary report on expressive movements and
communication in the Gombe Stream chimpanzees. In: Jay, P.C., editor.
Primates: Studies in Adaptation and Variability. New York: Holt, Rinehart
and Winston, pp. 313–519.
Chimpanzee facial expression SCAN (2006) 227
Haxby, J.V., Hoffman, E.A., Gobbini, M.I. (2000). The distributed
human neural system for face perception. Trends in Cognitive Science,
4, 223–33.
Hinde, R.A., Rowell, T.E. (1962). Communication by postures and facial
expressions in the rhesus monkey (Macaca mulatta). Proceedings of the
Royal Society of London B, 138, 1–21.
Kesler-West, M.L., Andersen, A.H., Smith, C.D., et al. (2001). Neural
substrates of facial emotion processing using fMRI. Cognitive Brain
Research, 11, 213–26.
Maurer, D., Le Grand, R., Mondloch, C. J. (2002). The many faces
of configural processing. Trends in Cognitive Science, 6, 255–60.
Parr, L.A., de Waal, F.B.M. (1999). Visual kin recognition in chimpanzees.
Nature, 399, 647–48.
Parr, L.A., Maestripieri, D. (2003). Nonvocal communication in nonhuman
primates. In: Maestripieri, D., editor. Primate Psychology: The Mind
and Behavior of Human and Nonhuman Primates. Chicago: Chicago
University Press, pp. 324–358.
Parr, L.A., Cohen, M., de Waal, F.B.M. (2005). The influence of social
context on the use of blended and graded facial displays in
chimpanzees (Pan troglodytes). International Journal of Primatology, 26,
73–103.
Parr, L.A., Heintz, M. (in press). The perception of unfamiliar faces and
houses by chimpanzees: influence of rotational angle. Perception.
Parr, L.A., Heintz, M., Akamagwuna, U. (2006). Three studies of configural
face processing by chimpanzees. Brain and Cognition, 62, 30–42.
Parr, L.A., Hopkins, W.D., de Waal, F.B.M. (1998). The perception of
facial expressions in chimpanzees (Pan troglodytes). Evolution of
Communication, 2, 1–23.
Parr, L.A., Waller, B.M., Vick, S.J., Bard, K.A. (in press). Classifying
chimpanzee facial expressions using muscle action. Emotion.
Parr, L.A., Winslow, J.T., Hopkins, W.D., de Waal, F.B.M. (2000).
Recognizing facial cues: Individual recognition in chimpanzees
(Pan troglodytes) and rhesus monkeys (Macaca mulatta). Journal
of Comparative Psychology, 114, 47–60.
Parr, L.A., Preuschoft, S., de Waal, F.B.M. (2002). Afterword: research on
facial emotion in chimpanzees, 75 years since Kohts. In: de Waal, F.B.M.,
editor. Infant Chimpanzee and Human Child. New York: Oxford
University Press, pp. 411–52.
Phan, K.L., Taylor, S.F., Welsh, R.C., Ho, S., Britton, J.C., Liberzon, I.
(2004). Neural correlates of individual ratings of emotional salience:
a trial-related fMRI study. NeuroImage, 21, 768–80.
Preuschoft, S., van Hooff, J.A.R.A.M. (1997). The social function of ‘‘smile’’
and ‘‘laughter’’: variations across primate species and societies.
In: Segerstrale, U., Molnár, P., editors. Nonverbal Communication: Where
Nature Meets Culture. New Jersey: Erlbaum, pp. 252–81.
Preuschoft, S., van Hooff, J.A.R.A.M. (1995). Homologizing primate facial
displays: a critical review of methods. Folia Primatologica, 65, 121–37.
Redican, W.K. (1982). An evolutionary perspective on human facial
displays. In: Ekman, P., editor. Emotion in the Human Face. New York:
Cambridge University Press, pp. 212–80.
Smith, W.J. (1977). The Behaviour of Communication. An Ethological
Approach. Cambridge: Harvard University Press.
van Hooff, J.A.R.A.M. (1962). Facial expressions in higher primates.
Symposia of the Zoological Society of London, 8, 97–125.
van Hooff, J.A.R.A.M. (1967). The facial displays of the Catarrhine monkeys
and apes. In: Morris, D., editor. Primate Ethology. Chicago: Aldine,
pp. 7–68.
van Hooff, J.A.R.A.M. (1972). A comparative approach to the phylogeny
of laughter and smiling. In: Hinde, R.A., editor. NonVerbal
Communication. Cambridge: Cambridge University Press, pp. 209–40.
van Hooff, J.A.R.A.M. (1973). A structural analysis of the social behavior of
a semi-captive group of chimpanzees. In: von Cranach, M., Vine, I., editors.
Social Communication and Movement. London: Academic Press,
pp. 75–162.
van Hooff, J.A.R.A.M. (1976). The comparison of the facial expressions
in man and higher primates. In: von Cranach, M., editor. Methods of
inference from animal to human behavior. Chicago: Aldine, pp. 165–96.
Vick, S.J., Waller, B.M., Parr, L.A., Pasqualini-Smith, M., Bard, K.A.
(in press). A cross species comparison of facial morphology and
movement in humans and chimpanzees using FACS. Journal of
Nonverbal Behavior.
Wallbott, H.G., Ricci-Bitti, P. (1993). Decoders’ processing of emotional
facial expression – a top-down or bottom-up mechanism? European
Journal of Social Psychology, 23, 427–43.
Waller, B., Dunbar, R.I.M. (2005). Differential behavioural effects of silent
bared teeth display and relaxed open mouth display in chimpanzees
(Pan troglodytes). Ethology, 111, 129–42.
Waller, B.M., Vick, S.J., Parr, L.A., et al. (2006). Intramuscular electrical
stimulation of facial muscles in humans and chimpanzees: Duchenne
revisited and extended. Emotion, 6, 367–82.
Weigel, R.M. (1979). The facial expressions of the brown capuchin monkey
(Cebus apella). Behaviour, 68, 250–76.
Classifying Chimpanzee Facial Expressions Using Muscle Action
Lisa A. Parr, Yerkes National Primate Research Center and Department of Psychiatry and Behavioral Sciences, Emory University
Bridget M. Waller, Centre for the Study of Emotion, Department of Psychology, University of Portsmouth, Portsmouth, Hampshire, United Kingdom
Sarah J. Vick, Department of Psychology, University of Stirling, Scotland, United Kingdom
Kim A. Bard, Centre for the Study of Emotion, Department of Psychology, University of Portsmouth, Portsmouth, Hampshire, United Kingdom
Abstract
The Chimpanzee Facial Action Coding System (ChimpFACS) is an objective, standardized observational tool for measuring facial movement in chimpanzees based on the well-known human Facial Action Coding System (FACS; P. Ekman & W. V. Friesen, 1978). This tool enables direct structural comparisons of facial expressions between humans and chimpanzees in terms of their common underlying musculature. Here the authors provide data on the first application of the ChimpFACS to validate existing categories of chimpanzee facial expressions using discriminant functions analyses. The ChimpFACS validated most existing expression categories (6 of 9) and, where the predicted group memberships were poor, the authors discuss potential problems with ChimpFACS and/or existing categorizations. The authors also report the prototypical movement configurations associated with these 6 expression categories. For all expressions, unique combinations of muscle movements were identified, and these are illustrated as peak intensity prototypical expression configurations. Finally, the authors suggest a potential homology between these prototypical chimpanzee expressions and human expressions based on structural similarities. These results contribute to our understanding of the evolution of emotional communication by suggesting several structural homologies between the facial expressions of chimpanzees and humans and facilitating future research.
Keywords
ChimpFACS; facial expressions; emotion; homology
Research over the past half century has considerably advanced our understanding of facial expressions and emotional communication in animals. This includes detailed descriptions of
Copyright 2007 by the American Psychological Association
Correspondence concerning this article should be addressed to Lisa A. Parr, Yerkes National Primate Research Center, 954 Gatewood Road, Atlanta, GA 30329. [email protected].
Portions of this article were presented on September 13, 2005, at the 11th European Conference on Facial Expression, Durham, United Kingdom. We thank Frans de Waal, Living Links Center, Emory University, and the Chester Zoo, Chester, United Kingdom, for the use of some photographic material. The Yerkes National Primate Research Center is fully accredited by the American Association for Accreditation of Laboratory Animal Care.
Emotion. Author manuscript; available in PMC 2010 February 22.
Published in final edited form as: Emotion. 2007 February; 7(1): 172–181. doi:10.1037/1528-3542.7.1.172.
expression repertoires for a variety of primate and nonprimate species, including chimpanzees, bonobos, rhesus monkeys, capuchin monkeys, and canids (Andrew, 1963; Bolwig, 1964; de Waal, 1988; Fox, 1969; Goodall, 1986; Hinde & Rowell, 1962; Preuschoft & van Hooff, 1995, 1997; Redican, 1975; van Hooff, 1962, 1967, 1973; van Lawick-Goodall, 1968; Weigel, 1979). In the tradition of ethology, these facial expression ethograms include a name for the behavior, a description of its appearance, any accompanying vocalization, and perhaps some information about how it is used and/or its presumed social function. Parr and colleagues (2005), for example, published a comprehensive ethogram of chimpanzee facial displays that provides an objective description of the display, photograph, and spectrogram of any accompanying vocalization and describes their social context of use. Moreover, they cross-referenced the names for these displays with existing accounts from other investigators in an attempt to standardize descriptions and nomenclature. While there is little disagreement that the use of such ethograms has advanced our understanding of animal behavior (Tinbergen, 1963), there has been little formal attempt to standardize ethograms for the same species across research groups or any attempt to compare specific behavior across species. Therefore, the lack of a common metric for describing facial expressions has prevented a complete understanding of the evolution of communicative signals or their homology, both within and across species.
Facial expressions are difficult to describe accurately, even in humans (Ekman, Friesen, & Hager, 2002a), and this is especially challenging given their dynamic nature. Studies of human facial expressions, in contrast to those of animals, have been aided by the development of an objective and standardized observational tool for describing facial movement, the Facial Action Coding System (FACS; Ekman & Friesen, 1978; Ekman et al., 2002a). The FACS is an anatomically based coding system, with each independent unit of observable movement on the face (referred to as action units, or AUs) described in terms of how the underlying muscle action alters the appearance of facial landmarks. Despite the fact that the FACS is now over 30 years old, its standardization across research groups has been enforced through a refereed certification process, maintaining it as the gold standard for describing facial action regardless of the specific research application (see Rosenberg & Ekman, 1997). Moreover, because of its anatomical basis and comprehensive approach to facial movements, the FACS is particularly well suited for modification for use with other populations, such as human infants (BabyFACS; Oster, 2001) and nonhuman primates (Vick et al., 2007).
To date, several nonhuman primate studies have attempted to incorporate various aspects of the FACS to better describe facial movement and to provide a more standardized method for identifying and comparing facial expressions across species. Steiner and colleagues, for example, examined facial responses to tastes in numerous species of nonhuman primates including prosimians, New World monkeys, Old World monkeys, and hominoids (Steiner & Glaser, 1984, 1995; Steiner, Glaser, Hawilo, & Berridge, 2001). Facial reactions were coded using detailed ethograms of facial movements, including components of the FACS and ethological descriptions of primate facial expressions in the published literature (Andrew, 1963). In this way, they were able to compare specific facial reactions to tastes across species using standardized terminology. In addition, Preuschoft and van Hooff (1995) used FACS to describe similarity in the appearance of the silent bared-teeth displays in related macaque species. Without detailed analysis of the muscular basis of these movements, however, these pioneering studies only provide a subjective impression of the facial behaviors, a criterion that is not sufficient to demonstrate structural homology.
Recently, we have developed a chimpanzee version of the FACS (Vick, Waller, Parr, Smith-Pasqualini, & Bard, in press). This was an extensive multidisciplinary project that began first with facial dissections of several chimpanzees in order to verify the presence and location of the mimetic facial muscles. These dissections confirmed the presence of all mimetic facial muscles in the chimpanzees that have been reported in humans (Burrows, Waller, Parr, &
Bonar, 2006). Next, we conducted intramuscular stimulations of the mimetic facial muscles in both humans and chimpanzees to compare the muscular basis of chimpanzee and human facial movements (Waller et al., 2006). The morphology of the chimpanzee face is quite different from the human face, making it necessary to confirm the presence of similar facial appearance changes when specific muscles were stimulated (see Figure 1). The final stage was to observe hundreds of photos and videos of chimpanzees performing naturally occurring facial movements and document the presence of similar facial appearance changes as have been reported in the human FACS. These facial movements were then labeled and described in a manner comparable with the human FACS, taking into consideration differences in the facial morphology (Vick et al., 2007). The culmination of this research has confirmed that the ChimpFACS describes facial movement on the basis of the underlying musculature and how it functions to change the appearance of the chimpanzee face.
The goals of this study are twofold. First, this article aims to determine whether ChimpFACS can validate the previous classification of chimpanzee facial displays, despite their often graded and blended appearance (see Parr et al., 2005). Second, one of the most important and exciting applications for the ChimpFACS is the potential for making comparisons of facial expressions between chimpanzees and humans on the basis of their homologous musculature. This rigorous and objective tool will provide a necessary first step toward assessing the homology of emotional expressions in these two hominoid species, and will be critical in aiding future work to assess the emotional quality of these expressions and their potential social and emotional function.
Method
Stimuli and Classification
Facial displays came primarily from the large database of chimpanzee facial expressions maintained by Lisa A. Parr at the Yerkes National Primate Research Center. The majority of these photographs were taken using a digital camera (Canon EOS Rebel), but many were scanned from photographs or extracted from digital video using Adobe Premiere (single frames captured at 30 fps [frames per second]). These facial displays occurred naturally during social interactions and were never provoked by experimenters. All expressions in the database were used, unless it could be determined that the movement was nonexpressive, that is, the animal was in the process of feeding, yawning, and so forth, but some of these examples were included in an ambiguous category, described below. The only other exclusionary criterion was if the head angle prevented accurate ChimpFACS coding; however, this represented only a handful of examples, as these photographs would not typically be included in the database.
Once images had been selected, the analyses included over 250 facial displays (N = 259) from approximately 100 different individuals (87 individuals were identified by name and >10 others were not identified by name). These individuals lived in several colonies housed in the United States, several in the United Kingdom, and some from other colonies across Europe. The displays were categorized by expression type by Lisa A. Parr using published guidelines, resulting in nine major categories (Parr et al., 2005). The Parr et al. (2005) ethogram is unique in that it cross-references descriptions given to chimpanzee facial behaviors by other experts in the field and thus represents the most comprehensive published description of chimpanzee facial displays available to date. Some of these other sources included Goodall (1986), van Hooff (1967, 1973), Marler (1965, 1976), Marler and Tenaza (1976), Yerkes (1943), and Yerkes and Learned (1925). These categories include the bared-teeth display (n = 41), relaxed open-mouth face or play face (n = 31), pant-hoot (n = 56), ambiguous faces (n = 24), neutral faces (n = 15), scream (n = 34), alert face (n = 18), pout (n = 29), and whimper (n = 11). We also chose to include an ambiguous category, which contained faces that were difficult to isolate into any one discrete category as they were too graded in intensity, were produced during
feeding, or were other nonexpressive movements, such as yawns. Also included in this category were funny faces—unusual facial movements, often asymmetrical, produced idiosyncratically by only a select number of individuals.
Procedure
All facial stimuli were delivered to Bridget M. Waller for ChimpFACS coding. Still images were compared with a neutral image whenever possible, and each image was screened at least twice (upper and lower face) to assign AUs. We did not include AU categories describing head or eye position or vocalizations. To assess interobserver reliability of the coding, Sarah J. Vick additionally coded 76 of the 259 displays (~30%). Waller and Vick are at present the only two certified ChimpFACS coders (Bakeman & Gottman, 1986). Reliability was determined for each facial expression using the following equation, as recommended in the human FACS manual (Ekman, Friesen, & Hager, 2002b):

agreement = (2 × number of AUs on which both coders agreed) / (total number of AUs scored by the two coders)

This calculation gives an agreement for each expression between 0 and 1 (0 = no agreement, and 1 = absolute agreement). The average agreement for the 76 displays was 0.78, which is above the criterion required to become a certified human FACS coder (0.70).
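As a minimal sketch, the agreement ratio can be computed from the two coders' AU sets for a single image (the AU sets in the example are hypothetical, not taken from the study):

```python
def facs_agreement(coder_a, coder_b):
    """FACS-style agreement for one expression: twice the number of AUs
    scored by both coders, divided by the total number of AUs each scored."""
    a, b = set(coder_a), set(coder_b)
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical example: coder A scores AU10+12+25, coder B scores AU10+12+16+25.
# Three AUs agree out of 3 + 4 scored in total, giving 6/7.
print(round(facs_agreement({10, 12, 25}, {10, 12, 16, 25}), 2))  # 0.86
```

Identical codings yield 1.0, and disjoint codings yield 0, matching the bounds described in the text.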
The expressions were coded using the entire ChimpFACS manual, not an a priori list of AUs. However, after the coding had been completed, there were 15 AUs that identified all the expression examples. Therefore, the following AUs were considered in subsequent analyses: AU1 + 2 = inner and outer brow raiser; AU6 = cheek raiser; AU9 = nose wrinkle; AU10 = upper lip raiser; AU12 = lip corner puller; AU16 = lower lip depressor; AU17 = chin raiser; AU19 = tongue protrusion; AU22 = lip funneler; AU24 = lip presser; AU25 = lips parted; AU26 = jaw drop; AU27 = mouth stretch; AU28 = lip suck; and AU160 = lower lip relax (an AU unique to the ChimpFACS). Most of these AUs are listed in Figure 1 according to their muscular origin and insertion on a chimpanzee face and its equivalent on a human face. It should be noted that some AUs cannot occur together; for example, AU26 and AU27 are mutually exclusive. Given the variation in the number of AUs present and possible combinations of movements, incompatible AUs notwithstanding, there are many possible combinations that can be formed from these 15 AUs.
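To make the last sentence concrete, the non-empty combinations of the 15 AUs can be enumerated directly; as a sketch, only the one incompatibility named in the text (AU26 vs. AU27) is modeled:

```python
from itertools import combinations

# The 15 AUs retained for analysis (AU1+2 treated as a single unit)
AUS = ["1+2", "6", "9", "10", "12", "16", "17", "19",
       "22", "24", "25", "26", "27", "28", "160"]

def is_valid(combo):
    # AU26 (jaw drop) and AU27 (mouth stretch) are mutually exclusive
    return not ("26" in combo and "27" in combo)

total = sum(1 for r in range(1, len(AUS) + 1)
            for combo in combinations(AUS, r) if is_valid(combo))
print(total)  # 2**15 - 2**13 - 1 = 24575 non-empty combinations
```

Even with this single exclusion rule, tens of thousands of configurations remain possible, which is why a statistical procedure is needed to find the combinations that actually characterize each expression category.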
Data Analysis
The data were entered into an Excel spreadsheet listing the stimulus name, expression category (as defined by Lisa A. Parr), and the presence or absence of any of the 15 possible AUs included in the ChimpFACS that describe muscle movements. A discriminant functions analysis (DFA) was then performed to identify similarities and differences in action unit combinations across the nine expression categories. This analysis identifies a linear combination of quantitative predictor variables, represented as one or more discriminant functions (N − 1 canonical discriminant functions, one fewer than the number of groups entered into the analysis), that best characterize differences across groups of dependent variables, in this case, muscle action or AUs. In other words, the analysis provides a determination of whether our a priori category assignments for expressions would be predicted on the basis of the AU combinations. The present analysis also used a leave-one-out classification procedure, in which functions are generated from a sample of cases for which group membership is known (a priori coded expression categories) and then applied as a model to variables for which group membership is not known. This cross-validation of predicted groups is used to determine the overall success of the model. This approach was used to address the first aim of this article,
whether ChimpFACS can validate the categories used by chimpanzee experts to classify facial expressions. The category assignment probabilities generated by the DFA were then used to identify the most prototypical combinations of AUs for each expression category. We defined these as the images that achieved over 90% correct assignment to a particular expression category. All statistics were performed using SPSS (Version 11.5.1; SPSS Inc., Chicago).
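The study's DFA was run in SPSS; purely as an illustrative stand-in, the leave-one-out logic can be sketched with a simple nearest-centroid classifier over binary AU vectors. The categories and vectors below are hypothetical toy data, not the study's:

```python
def centroid(vectors):
    """Mean AU vector for one category."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def predict(centroids, x):
    # assign x to the category whose mean AU vector is nearest (squared distance)
    return min(centroids, key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(centroids[c], x)))

def leave_one_out_accuracy(data):
    """data: list of (category, binary AU vector) pairs. Each case is
    classified by a model built from all other cases."""
    hits = 0
    for i, (true_cat, x) in enumerate(data):
        rest = data[:i] + data[i + 1:]
        cents = {c: centroid([v for cc, v in rest if cc == c])
                 for c in {cc for cc, _ in rest}}
        hits += predict(cents, x) == true_cat
    return hits / len(data)

# Toy vectors over [AU10, AU12, AU22, AU25, AU27]
data = [
    ("bared_teeth", [1, 1, 0, 1, 0]),
    ("bared_teeth", [1, 1, 0, 1, 0]),
    ("bared_teeth", [1, 1, 0, 1, 1]),
    ("pant_hoot",   [0, 0, 1, 1, 0]),
    ("pant_hoot",   [0, 0, 1, 1, 0]),
    ("scream",      [1, 1, 0, 1, 1]),
    ("scream",      [1, 1, 0, 1, 1]),
]
print(leave_one_out_accuracy(data))  # 6 of 7 toy cases reassigned correctly
```

The one miss is the bared-teeth case whose AU vector matches the scream prototype exactly, mirroring how graded and blended displays drive the misclassifications reported below.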
Our final aim was to present a potential framework for assessing the homology between chimpanzee and human facial expressions. This was done by creating a table that illustrates representative examples of human facial expressions (taken from Martinez & Benavente, 1998) that were matched to chimpanzee facial expressions according to their similar FACS codes (using FACS and ChimpFACS). Therefore, the resulting table provides a suggestive framework for homology based on similar structural (muscular) morphology. Such a framework may then be used as a guide in future studies aimed at understanding the emotional similarity and social function of these expressions and, ultimately, to further understand the evolution of human emotion.
Results
Facial Expression Categories Using ChimpFACS
DFA, with expression category as the grouping factor, identified eight standardized canonical functions that accounted for 69.6%, 12.5%, 7.1%, 6.2%, 2.3%, 1.4%, 0.7%, and 0.1% of the variance, respectively. The eigenvalues ranged from 8.48 for Function 1 to 0.01 for Function 8. These first two functions accounted for over 82% of the cumulative variance. Wilks's lambda revealed significant differences across the means of each discriminant function, χ²(112) = 1,198.403, p < .0001.
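Each function's variance share is simply its eigenvalue divided by the sum of all eight. Only the first and last eigenvalues (8.48 and 0.01) are reported above, so the intermediate values in this sketch are hypothetical, chosen only to show the arithmetic and to land near the reported percentages:

```python
# Hypothetical eigenvalues; only 8.48 (Function 1) and 0.01 (Function 8)
# come from the text, the rest are illustrative placeholders.
eigenvalues = [8.48, 1.52, 0.87, 0.75, 0.28, 0.17, 0.09, 0.01]
total = sum(eigenvalues)
pct = [100 * ev / total for ev in eigenvalues]
print([round(p, 1) for p in pct])
```

With these placeholders the first function carries just under 70% of the variance and the first two together exceed 82%, consistent with the figures in the text.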
The average correct assignment of expression categories was 71.0%, with a cross-validated assignment of 67.6%. This cross-validation procedure classifies each case in the analysis according to the functions derived from all other cases, as previously described, so it can be used to assess the success of the model. The percentage of expressions correctly assigned to each of the nine categories was as follows: bared-teeth display = 70.7%; relaxed open-mouth face = 87.1%; pant-hoot = 94.6%; ambiguous = 16.7%; neutral = 40.0%; scream = 97.1%; alert face = 50.0%; pout = 24.1%; and whimper = 63.6% (see Table 1 for the overall predicted classifications). Table 2 lists the number of AUs observed for each expression category, providing some detail on the range of possible AUs involved in each expression.
Determining Prototypical Facial Expressions From AUs
To identify which AU combinations best represented a particular chimpanzee expression category, we further analyzed expressions that showed over 90% probability of correct assignment to each of the nine groups, according to the DFA described above. These represented expressions with the most reliably coded AU configurations and can thus be considered the most prototypical configurations for each facial expression category. The following section also lists the number of expressions in each category originally included in the analysis and the number that was agreed upon by both the DFA and the human rater (Lisa A. Parr). Readers are referred to Table 1 for information pertaining to the classifications that resulted when agreement was not high.
Bared-Teeth Displays
The DFA identified 34 bared-teeth displays from the original set of 41 expressions classified by Lisa A. Parr. Of these, 33 were classified as having greater than 90% probability of correct assignment to the bared-teeth category. Twenty-nine of the 34 (88% agreement; see Footnote 1) were also identified by Parr as bared-teeth displays. Six AUs were included in these expressions with the
following percentage of occurrence: AU6 = 18.18%; AU10 = 75.76%; AU12 = 100%; AU16 = 57.78%; AU25 = 100%; and AU26 = 18.18%. On the basis of this information, the bared-teeth display was best identified in two configurations: AU10 + 12 + 25 and AU10 + 12 + 16 + 25. In more descriptive terminology, the first configuration consists of an open mouth (AU26) with lips parted (AU25), a raised upper lip (AU10), and retracted lip corners (AU12) functioning to expose the teeth. The second configuration is the same, except the lower lip is also depressed (AU16), exposing more of the bottom teeth. Figure 2 shows an example of each of these bared-teeth configurations. Previous studies have found that the bared-teeth display is most typically used when contact is initiated, such as approaches, embraces, invitations, play, and in response to aggression (Parr et al., 2005). In line with these social contexts, Waller and Dunbar (2005) have recently suggested that the bared-teeth display is a signal of benign intent in that it functions to reduce uncertainty in a variety of both aggressive and affiliative situations and increase the likelihood of affiliative behavior.
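The per-AU occurrence percentages quoted in these sections are simple frequencies over the prototypical images in a category. A sketch, using three hypothetical coded images rather than the study's data:

```python
from collections import Counter

def au_occurrence(expressions):
    """Percentage of images in which each AU appears; each expression
    is the set of AUs coded for one image."""
    n = len(expressions)
    counts = Counter(au for aus in expressions for au in aus)
    return {au: 100 * c / n for au, c in sorted(counts.items())}

# Hypothetical ChimpFACS codes for three bared-teeth images
coded = [{10, 12, 25}, {10, 12, 16, 25}, {10, 12, 16, 25}]
occ = au_occurrence(coded)
print(occ)  # AU10, AU12, and AU25 appear in 100% of the images
```

AUs at or near 100% occurrence are the ones that end up in the prototypical configuration, while low-frequency AUs (like AU6 above, at 18.18%) are treated as incidental.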
Play Faces
The DFA identified 39 play faces from an original set of 31. Of these, 32 were identified as having greater than 90% correct assignment to the category of play face. Twenty-five of these (78%) were also identified by Lisa A. Parr as play faces. Six AUs were included in these expressions with the following percentages of occurrence: AU12 = 100%; AU16 = 9.38%; AU19 = 3.13%; AU25 = 100%; AU26 = 62.5%; and AU27 = 37.5%. On the basis of this information, the play face is best defined as two configurations, AU12 + 25 + 26 and AU12 + 25 + 27. These are mutually exclusive combinations, as AU26 and AU27 are never coded together because they each describe actions of lowering the jaw, jaw drop and jaw stretch, respectively. The descriptive interpretation is that the play face is lip corners stretched (AU12) with mouth open (AU26), or mouth stretched wide open (AU27), and lips parted (AU25) in both cases. Figure 2 shows an example of each play face configuration. Previous studies have shown that the play face (relaxed open-mouth face) is used almost exclusively in the social context of play, and more often by subadults than by adult individuals (Parr et al., 2005; van Hooff, 1973), and is associated with longer play bouts when used by both play interactants (Waller & Dunbar, 2005).
Pant-Hoots
The DFA identified 81 pant-hoots from an original sample of 56. Only 11 of these, however, were identified as having greater than 90% correct assignment as pant-hoots. All of these 11 were also categorized by Lisa A. Parr as pant-hoots, and they consisted of AU22 + 25 + 26. The majority of the pant-hoots identified by the DFA (n = 68) showed a slightly weaker probability of category assignment, falling between 89% and 60%. These were all characterized by AU22 + 25. The further description of these movements is that the lips are funneled (AU22) and parted (AU25), either with an open mouth (AU26) or without. Figure 2 shows these two pant-hoot configurations. Previous studies have shown that pant-hoots are used in situations of general excitement, such as in response to noise at a distance, distress, bluff displays, play, and during piloerection (Parr et al., 2005; van Hooff, 1973). Moreover, data from field studies have suggested that the pant-hoot is a long-distance signal that may communicate excitement over food availability (Mitani, Gros-Louis, & Macedonia, 1996).
1 The data in Table 1 describe the agreement (N = 29) between the initial classifications provided by Lisa A. Parr and the classifications identified by the analysis based on action unit combinations. Thus, for the bared-teeth display, there were 29 expressions in agreement out of an original set of 41 (29/41 = 70.7%). The identification of prototypical action unit combinations was based on the top 90% of expressions identified by the analysis as belonging to that category; therefore, these numbers are not identical.
Parr et al. Page 6
Emotion. Author manuscript; available in PMC 2010 February 22.
Ambiguous Facial Displays
We thought it would be interesting to see how ChimpFACS classified expressions that were very difficult for us to classify, including those that consisted of other types of facial movements like funny faces and yawns. The DFA identified 9 ambiguous faces, and all of these were identified as being correctly assigned to the ambiguous category above 90% probability. Lisa A. Parr agreed with the ambiguous label for 89% of these, from the original sample of 24. There were, however, no specific combinations of AUs identified for these faces by ChimpFACS, as was predicted. The AUs involved in these faces included 1 AU6 (cheek raiser), 3 AU9s (nose wrinkle), 2 AU10s (upper lip raiser), 4 AU12s (lip corner puller), 1 AU16 (lower lip depressor), 2 AU17s (chin raiser), 1 AU19 (tongue protrusion), 4 AU22s (lip funneler), 2 AU24s (lips pressed), 6 AU25s (lips parted), 1 AU26 (jaw drop), and 1 AU27 (mouth stretch).
Screams
The DFA identified 41 screams from an original sample of 34, and all of these were classified as having greater than 90% correct assignment to the scream category. Eighty percent of these were also categorized by Lisa A. Parr as screams. Six AUs were included in these expressions with the following percentages of occurrence: AU6 = 4.88%; AU10 = 100%; AU12 = 100%; AU16 = 75.61%; AU25 = 95.12%; and AU27 = 100%. On the basis of this information, the scream face was best defined as AU10 + 12 + 16 + 25 + 27. This is described as a raised upper lip (AU10) with lip corners pulled back (AU12) exposing the upper teeth, lower lip depressed (AU16) also exposing the lower teeth, and mouth stretched wide open (AU27) with lips parted (AU25). Figure 2 shows the scream configuration. Previous studies have demonstrated that chimpanzee screams are used in a variety of social contexts characterized by nervousness, fear, and distress: as a prolonged response to aggression, during sex, when fleeing an attacker, in some play contexts, and in contexts of general agonism. Recently, several authors have suggested that there may, in fact, be several distinct forms of screams that can be identified using acoustic analyses: a more submissive scream used by victims in a fight or by recipients of contact aggression, and a more aggressive scream used by the attacker or pursuer in a conflict (Siebert & Parr, 2003; Slocombe & Zuberbuhler, 2005). To date, it is unknown whether these acoustic categories may also be identified in the visual domain. For the purposes of this study, all screams were considered the same.
Alert Faces
The DFA identified 24 alert faces from an original sample of 18, and all of these were classified as having greater than 90% correct assignment to the alert face category. These were represented by AU25. Only 46% of these were categorized by Lisa A. Parr as alert faces; this proved to be a difficult category both for us and for the analysis. There were six additional AUs identified for this expression category with the following percentages of occurrence: AU10 = 4.17%; AU12 = 4.17%; AU16 = 20.83%; AU26 = 20.83%; AU27 = 12.5%; and AU160 = 20.83%. Thus, a strict interpretation would give a prototypical configuration for the alert face as only AU25, but the following combinations were also common: AU16 + 25, AU25 + 26, AU25 + 160, and AU16 + 25 + 26. These are described as lower lip depressed (AU16) and lips parted (AU25), lips parted (AU25) with mouth open (AU26), or lips parted (AU25) with lower lip relaxed (AU160). Figure 2 shows the AU16 + 25 configuration. Previous studies have shown that the alert face is used most often during play, bluff displays, and in response to aggression, all situations in which there may be some element of anxiety or uncertainty (Parr et al., 2005).
Pouts
The DFA identified 14 pout faces from an original sample of 29, 9 of which were identified as having greater than 90% correct assignment to the pout category. All of these had AU22
and only 2 (11.11%) also had AU17. Seventy-eight percent of these were categorized by Lisa A. Parr as pouts. However, as for the pant-hoot, the majority of these expressions were classified by the DFA as having less than 90% predicted membership in the pout category. Five expressions were classified between 96% and 50%, and these all consisted of AU17 + 22 + 25. The remaining expressions had only a weak probability of assignment to the pout category (<40%), and when these were examined individually, they were all found to be classified as pant-hoots by the analysis. Therefore, the most prototypical combination of AUs to identify the pout and, in particular, dissociate it from the pant-hoot was AU17 + 22 + 25 (see Figure 2). This is described as chin raised (AU17) with lips funneled (AU22) and lips parted (AU25). Previous studies have shown that pouts are used during embraces, invitations, play, and approaches, and in response to aggression (Parr et al., 2005; van Hooff, 1973). Therefore, pouts may represent a need for contact, reassurance, or physical affinity.
Whimpers
The DFA identified 8 whimper faces from an original sample of 11, and all of these were identified as having greater than 90% correct assignment as whimpers. Lisa A. Parr agreed with 88% of these classifications. The expressions all contained AU12 + 22 + 25. Three whimpers each had one additional AU (AU6, AU10, or AU160), and two had AU16. Therefore, the prototypical configuration for the whimper was AU12 + 22 + 25. This is described as lip corners retracted (AU12) and lips funneled (AU22) with lips parted (AU25). Figure 2 shows the configuration of all AUs. Previous studies have shown the stretch pout whimper to be associated with responses to aggression, signaling nervousness or fear in tense situations (Parr et al., 2005). van Hooff (1973) clustered the whimper with affinitive behaviors, including the silent bared-teeth face, so it might function to induce succor from others.
Post Hoc Validation of AU Configurations
As a final validation of the uniqueness of these configurations and their predictive value, we were able to acquire a second set of images (>70) subsequent to the original submission of this article. These were ChimpFACS coded by Bridget M. Waller, and an Excel sheet listing the image names and ChimpFACS codes was then given to Lisa A. Parr to interpret using the prototypical AU configurations described above. For example, an image with the codes AU10 + 12 + 25 would be given the label of bared-teeth display. The list was also given to a second individual who had no knowledge of ChimpFACS, human FACS, or the purpose of the exercise. After this, Lisa A. Parr visually categorized each image and calculated percent correct against the classifications arrived at objectively from the AU configurations alone. This agreement was 83.6% for Lisa A. Parr's categorizations and 80.8% for the second, naïve rater. Thus, a human rater with no knowledge of how the coding system worked could identify expressions accurately more than 80% of the time solely by referencing these AU configurations.
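The labeling step described above, assigning an expression name from ChimpFACS codes alone, can be sketched as a lookup against the prototypical configurations reported in this section. The matching rule below (greatest Jaccard overlap) is our illustrative assumption, not the procedure the raters used; they simply matched codes against the printed configurations:

```python
# Prototypical AU configurations identified in this study (see Figure 2).
PROTOTYPES = {
    "bared-teeth": [{10, 12, 16, 25}],
    "play face":   [{12, 25, 26}, {12, 25, 27}],
    "pant-hoot":   [{22, 25}, {22, 25, 26}],
    "scream":      [{10, 12, 16, 25, 27}],
    "alert face":  [{25}, {16, 25}, {25, 26}, {16, 25, 26}],
    "pout":        [{17, 22, 25}],
    "whimper":     [{12, 22, 25}],
}

def classify(aus):
    """Return the label whose prototype best overlaps the coded AUs
    (Jaccard similarity; an illustrative rule, not the paper's DFA)."""
    aus = set(aus)
    def jaccard(a, b):
        return len(a & b) / len(a | b)
    scored = ((label, max(jaccard(aus, p) for p in protos))
              for label, protos in PROTOTYPES.items())
    return max(scored, key=lambda pair: pair[1])[0]

print(classify({12, 25, 27}))  # -> 'play face' (exact match)
print(classify({10, 12, 25}))  # -> 'bared-teeth' (closest prototype)
```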
Comparing Chimpanzee and Human Facial Expressions
One of the main applications of the ChimpFACS is to aid in understanding the evolution of emotional communication by facilitating comparisons between human and chimpanzee facial expressions. This has, to this point, been done only using subjective morphological descriptions (i.e., Chevalier-Skolnikoff, 1973; Ladygina-Kohts, 1935/2002). The final aim of this study was to compare expressions in these two species by grouping them according to shared AUs, thus presenting a suggested homology based on the structure, or muscular basis, of the expression. To meet this goal, we constructed Table 3. This shows several chimpanzee facial expressions listed with their AU codes derived from this study, alongside a comparable human facial expression with its AU codes (photos were taken from Martinez & Benavente, 1998). Codes shared by each expression type are in bold type. Thus, the two sets of facial expressions have been coded for homologous facial movement using the two comparable FACS systems. We have included
the bulging-lip face as a potential homologue of human anger (based on ChimpFACS codes), although this category was not included in the DFA because of small sample sizes. This table is descriptive in that it presents the potential application of FACS and ChimpFACS in understanding homologous facial expressions of emotion in these two species.
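The grouping logic behind Table 3 can be sketched as a set intersection over prototype codes. The AU sets below are taken from Table 3; the function name is our own illustration:

```python
# Prototype AU codes from Table 3: chimpanzee codes are from this study,
# human codes are the full FACS codes for the comparison images.
CHIMP = {
    "silent bared-teeth": {10, 12, 16, 25},
    "relaxed open-mouth": {12, 25, 26},
    "scream": {10, 12, 16, 25, 27},
}
HUMAN = {
    "smile": {6, 12, 25},
    "laughter": {6, 12, 25, 26},
    "scream": {4, 6, 7, 10, 12, 16, 25, 27},
}

def shared_aus(chimp_expr, human_expr):
    """AUs found in both the chimpanzee display and the proposed human
    equivalent (the codes set in bold type in Table 3)."""
    return sorted(CHIMP[chimp_expr] & HUMAN[human_expr])

print(shared_aus("silent bared-teeth", "smile"))     # [12, 25]
print(shared_aus("relaxed open-mouth", "laughter"))  # [12, 25, 26]
print(shared_aus("scream", "scream"))                # [10, 12, 16, 25, 27]
```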
Discussion
The continuity of facial behavior across primates has long been considered in terms of underlying musculature (Huber, 1931), expressiveness (Darwin, 1872; van Hooff, 1967), and emotional valence (Ladygina-Kohts, 1935/2002). The relative importance of these different aspects for the assessment of homology is still debated, but we propose that it is insufficient to look at only one level of analysis, for example, similarity of appearance (Preuschoft & van Hooff, 1995), muscular basis (Burrows et al., 2006), or signal function (e.g., Preuschoft, 1995; Waller & Dunbar, 2005). To examine the homology of primate facial displays, we need to describe facial behavior in greater detail, with greater units of analysis, and, ideally, make cross-species comparisons using equivalent methodologies. Here we have reported the first application of the ChimpFACS, an anatomically based system for classifying facial movements, to characterize the naturally occurring facial behavior of chimpanzees.
First and foremost, this study validates the ability of the ChimpFACS to classify chimpanzee facial expressions in a manner similar to those identified by human ethological observation. Second, and perhaps more important from an applied perspective, the analyses presented here made it possible to identify unique combinations of AUs that were most highly predictive of specific expression categories. This is an extremely useful application because chimpanzee facial expressions are both highly graded and blended signals, and capturing them at their peak intensity from photographs, or even from still frames of video recordings, is extremely difficult. These unique configurations may then be used to identify appropriate sets of peak-intensity stimuli for use in playback experiments or computerized tasks involving the discrimination of emotional stimuli by chimpanzees (see Parr, Hopkins, & de Waal, 1998). Moreover, these prototypical AU configurations are so robust that they could be used to identify expression categories with >80% accuracy by individuals totally naïve as to the appearance of the expressions or the purpose of the classifications.
For our second aim, we have proposed that the ChimpFACS will be extremely useful for understanding the evolution of emotional expressions, as it provides the necessary first step in establishing the structural homology between chimpanzee and human facial expressions, and it provides a common language for describing their similarities and differences. We have presented several suggestive comparisons in Table 3, but we caution readers in their interpretation of this table for the following reasons. First, in several of the examples, some AUs are clearly present in one species but not in the other. This is particularly true for upper face AUs that are present in the human expressions but were never part of the prototypical configuration for the chimpanzee expressions. However, the codes listed for the human expressions represent the full FACS codes for those specific images, not statistically derived prototypical configurations as is true for the chimpanzee. Thus, it is unclear what the prototypical configurations would be for human facial expressions, although the FACS Investigator's Guide provides some likely combinations for each expression (Ekman et al., 2002a). Second, the human expressions were posed, so some nuances of emotion might be lacking. In future studies, researchers will need to develop stimulus sets for human expressions based on spontaneously occurring behavior in order to generate prototypical AU configurations and to provide more accurate comparisons. Finally, to date, only minimal information is available concerning the emotional function of chimpanzee expressions, so these comparisons should be interpreted only in terms of the structure of the expressions, not their meaning or function. Future studies will examine these issues in more detail and with more precise
methodologies, including data on the antecedent conditions that elicit these expressions in both chimpanzees and humans, in order to more rigorously address the issue of homologous emotional expressions.
The results of this study demonstrate the utility of ChimpFACS in distinguishing between subtle expressions that have previously been difficult to characterize from the literature. The alert face is one such example. Parr and colleagues (2005) originally described this expression as a tense face in their observational study of facial expressions in captive chimpanzees. This label was derived largely from the morphological description of the tense face reported in catarrhine primates by van Hooff (1967), where the lips are stretched, exposing the lower teeth. van Hooff (1973), however, further characterized the tense-mouth face in the chimpanzee as a component of the bulging-lip face. However, the morphology of these two expressions appears to be quite distinct: The lips are not open but pressed together in the bulging-lip face. Apart from these references, descriptions of the chimpanzee tense face have previously been neglected, likely because it is quite subtle and may often be interpreted as a neutral face with open lips. Here we have identified a specific combination of AUs associated with what we now refer to as the alert face, AU16 + 25 + 26 (lower lip depressor, lips parted, jaw drop; see Figure 2), thus validating this as a specific facial expression. Parr and colleagues (2005) reported this expression as used most often in the context of vigilance and agonism; thus, the alert face appears to be a more appropriate label.
There were some facial expressions that ChimpFACS had difficulty discriminating between, such as pant-hoots and pouts. Only 24% of pouts were correctly identified as belonging to that category; the majority of pouts (over 72%) were identified by the DFA as pant-hoots. This, however, should not be seen as a failure of the ChimpFACS. First, the current ChimpFACS does not include measures of display intensity. The visual distinction between pouts and pant-hoots in terms of AUs and perceived configuration is minimal, so adding intensity may provide a clearer delineating marker for ChimpFACS. Moreover, not only do pouts and pant-hoots look very similar on initial inspection, but they often co-occur in the same behavioral sequences. Parr and colleagues (2005) found that the social contexts that elicited pouts and pant-hoots in a large social colony of chimpanzees were highly correlated. They can be further differentiated using FACS terminology by adding the AU that codes for a vocalization, for example, AU50. Pouts can occur with vocalizations, but these are not hoots; they are typically described as pout-moans (van Hooff, 1973; Parr et al., 2005). Pant-hoots are often used during bluff displays that have a dynamic movement component and can be followed by a climax scream. Thus, the future inclusion of intensity ratings, plus the addition of AU50 to an auditory pant-hoot, would distinguish it from the pout.
Another interesting finding was the apparent absence in the chimpanzee expressions of some highly salient and frequent AUs involved in human emotions. We coded an AU1 + 2 combination, the combined inner and outer brow raiser, in only two chimpanzee expressions (DFA identified as a pant-hoot and an alert face) and the AU9, nose wrinkler, in three expressions that the DFA all identified as ambiguous. The brow raising denoted by AU1 + 2 is commonly associated with human surprise, and the nose wrinkling of AU9 is a main component of disgust. Both of these movements have been reported in the published ChimpFACS (Vick et al., 2007), and the movements have been stimulated using intramuscular electrodes in a study designed to validate the appearance changes of these muscles on the chimpanzee face (Waller et al., 2006). Last, we never observed an AU4, or brow lowerer, and this movement is not included in the ChimpFACS manual despite being one of the most salient and relevant movements for human emotional expressions, such as anger, fear, and sadness (Kohler et al., 2004). Two possible explanations may be put forth for these absences. The first is that some of these movements are difficult to code in still photographs, such as the AU1 + 2. Moreover, most facial movements are generally more difficult to see on the chimpanzee face
because of morphological differences in the shape of the head, the size of features, and the availability of contrasts in color and texture. Chimpanzees do not have an elongated forehead and, in combination with their heavy brow ridge and lack of distinguishable hairy eyebrows, the brow-raising movement of AU1 + 2 is extremely difficult to see and most likely has little signal function for conspecifics. The nose wrinkle (AU9) has been observed, but more often in association with extraneous movements, such as when chimpanzees squint during a camera flash or find some food distasteful (Steiner et al., 2001). This perhaps explains why these movements were more often associated with expressions in the ambiguous category. The lack of an AU4 in the chimpanzee is very puzzling. One very tentative explanation, as with AU1 + 2, might be that the heavy brow ridge precludes this movement from being very salient in the chimpanzee, and thus it has not evolved a specific role in expressive communication. Moreover, although the muscles involved in this movement, the corrugator, depressor supercilii, and procerus, are present in the chimpanzee face (Burrows et al., 2006), they may not be strong enough to contract the skin across this brow ridge. Future studies will use histological techniques to gain insight into the microstructure of these facial muscles and may shed some light on these issues.
Finally, an unexpected finding was that neutral faces were not really neutral. The analyses presented here suggested that only 40% of the neutral faces were correctly assigned to the neutral category. The ChimpFACS provided a tool for coding subtle movements in those faces that were not identified on initial inspection and classification. Further analyses are needed to clarify whether faces that contain subtle, individual movements but appear neutral are able to communicate meaningful information to conspecifics. For example, in humans, an appraisal view of emotions has led to componential approaches to facial movement, ascribing specific meaning to the various components of facial expressions (Kaiser & Wehrle, 2004). Applications such as this are essential for the future development of ChimpFACS.
Acknowledgments

This study was supported by Grant RR-00165 from the National Institutes of Health/National Center for Research Resources to the Yerkes National Primate Research Center, Grant NIMH-R01-MH068791 from the National Institute of Mental Health to Lisa A. Parr, and Grant F/00678/E from the Leverhulme Trust to Kim A. Bard.
References

Andrew RJ. The origin and evolution of the calls and facial expressions of the primates. Behaviour 1963;20:1–109.
Bakeman, R.; Gottman, JM. Observing interaction: An introduction to sequential analysis. 2nd ed. New York: Cambridge University Press; 1986.
Bolwig N. Facial expression in primates with remarks on a parallel development in certain carnivores (a preliminary report on work in progress). Behaviour 1964;22:167–193.
Burrows AM, Waller BM, Parr LA, Bonar CJ. Muscles of facial expression in the chimpanzee (Pan troglodytes): Descriptive, comparative and phylogenetic contexts. Journal of Anatomy 2006;208:153–168. [PubMed: 16441560]
Chevalier-Skolnikoff, S. Facial expression of emotion in nonhuman primates. In: Ekman, P., editor. Darwin and facial expressions. New York: Academic Press; 1973. p. 11–89.
Darwin, C. The expression of the emotions in man and animals. 3rd ed. New York: Oxford University Press; 1998. (Original work published 1872)
de Waal FBM. The communicative repertoire of captive bonobos (Pan paniscus), compared to that of chimpanzees. Behaviour 1988;106:183–251.
Ekman, P.; Friesen, WV. Facial action coding system. Palo Alto, CA: Consulting Psychology Press; 1978.
Ekman, P.; Friesen, WV.; Hager, JC. Facial action coding system: The investigator's guide. Salt Lake City, UT: Research Nexus; 2002a.
Ekman, P.; Friesen, WV.; Hager, JC. Facial action coding system: The manual. Salt Lake City, UT: Research Nexus; 2002b.
Fox MW. A comparative study of the development of facial expressions in canids; wolf, coyote, and foxes. Behaviour 1969;36:4–73.
Goodall, J. The chimpanzees of Gombe: Patterns of behaviour. Cambridge, MA: Belknap Press; 1986.
Hinde RA, Rowell TE. Communication by postures and facial expressions in the rhesus monkey (Macaca mulatta). Proceedings of the Royal Society of London B 1962;138:1–21.
Huber, E. Evolution of facial musculature and facial expression. Oxford, England: Oxford University Press; 1931.
Kaiser, S.; Wehrle, T. Facial expressions in social interactions: Beyond basic emotions. In: Canamero, D.; Aylett, R., editors. Advances in consciousness research: Animating expressive characters for social interactions. Amsterdam: John Benjamins; 2004.
Kohler CG, Turner T, Stolar NM, Bilker WB, Brensinger CM, Gur RE, et al. Differences in facial expressions of four universal emotions. Psychiatry Research 2004;128:235–244. [PubMed: 15541780]
Ladygina-Kohts, NN. Infant chimpanzee and human child: A classic 1935 comparative study of ape emotion and intelligence. de Waal, FBM., editor. New York: Oxford University Press; 2002. (Original work published 1935)
Marler, P. Communication in monkeys and apes. In: DeVore, I., editor. Primate behavior. New York: Holt, Rinehart & Winston; 1965. p. 544–584.
Marler, P. Social organization, communication, and graded signals: The chimpanzee and the gorilla. In: Bateson, PPG.; Hinde, RA., editors. Growing points in ethology. London: Cambridge University Press; 1976. p. 239–279.
Marler, P.; Tenaza, R. Signaling behavior of apes with special reference to vocalization. In: Sebeok, T., editor. How animals communicate. Bloomington: Indiana University Press; 1976. p. 965–1033.
Martinez, AM.; Benavente, R. The AR face database (CVC Tech. Rep. No. 24). Barcelona, Spain: Computer Vision Center; 1998.
Mitani JC, Gros-Louis J, Macedonia JM. Selection for acoustic individuality within the vocal repertoire of wild chimpanzees. International Journal of Primatology 1996;17:569–581.
Oster, H. BabyFACS: Facial action coding system for infants and young children. New York University; 2001. Unpublished manual.
Parr LA, Cohen M, de Waal FBM. Influence of social context on the use of blended and graded facial displays in chimpanzees. International Journal of Primatology 2005;26:73–103.
Parr LA, Hopkins WD, de Waal FBM. The perception of facial expressions in chimpanzees (Pan troglodytes). Evolution of Communication 1998;2:1–23.
Parr, LA.; Waller, BM. The evolution of human emotion. In: Kaas, JA., editor. Evolution of nervous systems, Vol. 5: Evolution of primate nervous systems. Oxford: Elsevier; 2006. p. 447–472.
Preuschoft, S. "Laughter" and "smiling" in macaques—an evolutionary perspective. Unpublished doctoral dissertation. Utrecht, the Netherlands: University of Utrecht; 1995.
Preuschoft S, van Hooff JARAM. Homologizing primate facial displays: A critical review of methods. Folia Primatologica 1995;65:121–137.
Preuschoft, S.; van Hooff, JARAM. The social function of "smile" and "laughter": Variations across primate species and societies. In: Segerstrale, U.; Mobias, P., editors. Nonverbal communication: Where nature meets culture. Mahwah, NJ: Erlbaum; 1997. p. 252–281.
Redican, WK. Facial expressions in nonhuman primates. In: Rosenblum, LA., editor. Primate behaviour: Developments in field and laboratory research. London: Academic Press; 1975. p. 103–194.
Rosenberg, E.; Ekman, P. What the face reveals. New York: Cambridge University Press; 1997.
Siebert E, Parr LA. A structural and contextual analysis of chimpanzee screams. Annals of the New York Academy of Sciences 2003;1000:104–109. [PubMed: 14766626]
Slocombe K, Zuberbuhler K. Agonistic screams in wild chimpanzees (Pan troglodytes schweinfurthii) vary as a function of social role. Journal of Comparative Psychology 2005;119:67–77. [PubMed: 15740431]
Steiner JE, Glaser D. Differential behavioural responses to taste stimuli in non-human primates. Journal of Human Evolution 1984;13:709–723.
Steiner JE, Glaser D. Taste-induced facial expressions in apes and humans. Human Evolution 1995;10:97–105.
Steiner JE, Glaser D, Hawilo ME, Berridge KC. Comparative expression of hedonic impact: Affective reactions to taste by human infants and other primates. Neuroscience and Biobehavioural Reviews 2001;25:53–74.
Tinbergen N. On aims and methods of ethology. Zeitschrift für Tierpsychologie 1963;20:410–433.
van Hooff JARAM. Facial expressions in higher primates. Symposia of the Zoological Society of London 1962;8:97–125.
van Hooff, JARAM. The facial displays of the Catarrhine monkeys and apes. In: Morris, D., editor. Primate ethology. Chicago: Aldine; 1967. p. 7–68.
van Hooff, JARAM. A structural analysis of the social behaviour of a semi-captive group of chimpanzees. In: von Cranach, M.; Vine, I., editors. Expressive movement and non-verbal communication. London: Academic Press; 1973. p. 75–162.
van Lawick-Goodall, J. A preliminary report on expressive movements and communication in the Gombe Stream chimpanzees. In: Jay, PC., editor. Primates: Studies in adaptation and variability. New York: Holt, Rinehart, and Winston; 1968. p. 313–519.
Vick SJ, Waller BM, Parr LA, Smith Pasqualini MC, Bard KA. A cross species comparison of facial morphology and movement in humans and chimpanzees using FACS. Journal of Nonverbal Behaviour 2007;31:1–20.
Waller BM, Dunbar RIM. Differential behavioural effects of silent bared teeth display and relaxed open mouth display in chimpanzees (Pan troglodytes). Ethology 2005;111:129–142.
Waller BM, Vick SJ, Parr LA, Bard KA, Smith Pasqualini MC, Gothard KM, Fuglevand AJ. Intramuscular electrical stimulation of facial muscles in humans and chimpanzees: Duchenne revisited and extended. Emotion 2006;6:367–382. [PubMed: 16938079]
Weigel RM. The facial expressions of the brown capuchin monkey (Cebus apella). Behaviour 1979;68:250–276.
Yerkes, RM. The great apes. Oxford, United Kingdom: Oxford Press; 1943.
Yerkes, RM.; Learned, BW. Chimpanzee intelligence and its vocal expression. Oxford, United Kingdom: Oxford Press; 1925.
Figure 1. The action units (AUs) of the human and chimpanzee face. White circles correspond to approximate underlying muscle origins (excepting orbital muscles), and black lines show the estimated length/orbit of the muscle. Muscles move toward the origin when contracted (orbital muscles reduce the aperture of the orbit). The chimpanzee photograph was taken by Lisa A. Parr at the Chester Zoo, United Kingdom.
Figure 2. Examples of the prototypical configurations for chimpanzee facial expressions identified using the discriminant function analysis. Photographs are courtesy of the Living Links Center, Emory University (taken by F.B.M. de Waal or Lisa A. Parr).
Table 1
Percentage of Agreement for Classifications of Chimpanzee Facial Expressions Using ChimpFACS

Expression type      BT    PF    HO   AMB     N    SC    AL    PO    WH
Bared teeth (BT)   70.7  17.1   0.0   2.4   0.0   9.8   0.0   0.0   0.0
Play face (PF)      0.0  87.1   0.0   6.5   0.0   6.5   0.0   0.0   0.0
Pant-hoot (HO)      0.0   0.0  94.6   0.0   0.0   0.0   0.0   5.4   0.0
Ambiguous (AMB)    16.7   8.3  20.8  16.7   8.3   8.3  20.8   0.0   0.0
Neutral (N)         0.0   0.0   0.0   6.7  40.0   0.0  40.0  13.3   0.0
Scream (SC)         0.0   2.9   0.0   0.0   0.0  97.1   0.0   0.0   0.0
Alert face (AL)     5.6  16.7  22.2   0.0   5.6   0.0  50.0   0.0   0.0
Pout (PO)           0.0   0.0  72.4   0.0   0.0   0.0   0.0  24.1   3.4
Whimper (WH)        9.1   0.0  18.2   0.0   9.1   0.0   0.0   0.0  63.6

Note. A total of 67.6% of cross-validated grouped cases were correctly classified. The diagonal values (set in bold type in the original) indicate the percentage of agreement between the expression groups identified by the discriminant functions analysis and the predicted group membership. ChimpFACS = Chimpanzee Facial Action Coding System.
Table 2
Action Units Associated With Expressions in Each Predicted Category, as Determined by the Discriminant Functions Analysis

                       Action units
Expression     n  1+2   6   9  10  12  16  17  19  22  24  25  26  27  28
Bared-teeth   34    0   7   0  25  34  19   0   0   0   0  34   6   0   0
Play face     39    0   0   0   0  39   8   0   1   0   0  39  25  12   0
Pant-hoot     81    1   0   0   0   0   2   0   0  81   0  81  11   4   0
Ambiguous      9    0   1   3   2   4   1   2   1   4   2   6   1   1   0
Neutral        9    0   0   0   0   0   0   2   0   0   1   0   1   0   0
Scream        41    0   2   0  41  41  31   0   0   0   0  39   0  41   0
Alert face    24    1   0   0   1   1   5   0   0   0   0  24   5   3   0
Pout          14    0   0   0   0   0   0   7   0  14   0   5   0   0   0
Whimper        8    0   6   0   1   8   2   0   0   8   0   8   0   0   0
Parr et al. Page 18
Table 3

Facial expression             Prototype AUs                        Equivalent human facial expression?   Prototype AUs
Bulging-lip display           AU17* AU24*                          Angry face                            AU4 AU5 AU7 AU17* AU24*
Silent bared-teeth display    AU10 AU12* AU16 AU25*                Smile                                 AU6 AU12* AU25*
Scream                        AU10* AU12* AU16* AU25* AU27*        Scream                                AU4 AU6 AU7 AU10* AU12* AU16* AU25* AU27*
Pant-hoot                     AU22 AU25 AU26                       ?
Relaxed open mouth            AU12* AU25* AU26*                    Laughter                              AU6 AU12* AU25* AU26*

Note. Primate Nervous Systems, 2006. Copyright 2006 by Elsevier Press. Adapted with permission. Action units (AUs) marked with an asterisk (bold type in the original) are found in both the chimpanzee display and the proposed human equivalent. FACS = Facial Action Coding System.
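The AU overlap that the table marks can be computed directly as a set intersection over each expression pair. A minimal sketch, assuming an integer encoding of the AU labels (AU1+2 is omitted since none of these prototypes includes it, and pant-hoot is omitted since it has no proposed human equivalent); the dictionary and function names are illustrative:

```python
# Prototype action units (AUs) from the table, encoded as integer sets.
CHIMP = {
    "bulging-lip display": {17, 24},
    "silent bared-teeth display": {10, 12, 16, 25},
    "scream": {10, 12, 16, 25, 27},
    "relaxed open mouth": {12, 25, 26},
}
HUMAN = {
    "angry face": {4, 5, 7, 17, 24},
    "smile": {6, 12, 25},
    "scream": {4, 6, 7, 10, 12, 16, 25, 27},
    "laughter": {6, 12, 25, 26},
}
PAIRS = [
    ("bulging-lip display", "angry face"),
    ("silent bared-teeth display", "smile"),
    ("scream", "scream"),
    ("relaxed open mouth", "laughter"),
]

def shared_aus(chimp_expr: str, human_expr: str) -> set[int]:
    """AUs present in both the chimpanzee prototype and its proposed human equivalent."""
    return CHIMP[chimp_expr] & HUMAN[human_expr]

for chimp, human in PAIRS:
    print(chimp, "<->", human, sorted(shared_aus(chimp, human)))
```

For the chimpanzee and human screams the intersection recovers all five chimpanzee prototype AUs (10, 12, 16, 25, 27), which is why that pairing is the strongest candidate homologue in the table.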
Journal of Applied Animal Welfare Science
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/haaw20

Chimpanzees and Retirement
Carole Noon
Version of record first published: 04 Jun 2010
To cite this article: Carole Noon (1999): Chimpanzees and Retirement, Journal of Applied Animal Welfare Science, 2:2, 141-146
To link to this article: http://dx.doi.org/10.1207/s15327604jaws0202_6