
Vis Comput (2008) 24: 981–986
DOI 10.1007/s00371-008-0272-6

ORIGINAL ARTICLE

Character emotion experience in virtual environments

Nelson Zagalo · Ana Torres

Published online: 26 August 2008
© Springer-Verlag 2008

Abstract This paper presents an emotion module from an authoring tool for interactive storytelling being developed within the European project INSCAPE. The Atmosphere Editor (AE) is an INSCAPE software plug-in. Its aim is to help authors easily create virtual interactive scenes that are recognized as emotional, in order to give their content greater coherence and, at the same time, to reinforce its communicative purpose. It works by attributing emotional meaning to classes of virtual environments and characters that act on the virtual story world. It is therefore designed to make a semantic intervention in the story without overriding the storyteller's work. The AE thus offers a taxonomy capable of sustaining the communicational optimization of interactive narratives at an emotional level. The AE intervention also has a potential pedagogical value, allowing story authors to learn about possible emotional uses of specific virtual parameters and helping INSCAPE users to understand the emotional semantic conventions of interactive virtual stories.

Keywords Authoring tools · Emotion · Virtual environments · Virtual storytelling

N. Zagalo (✉)
Department of Communication Sciences, University of Minho, 4710-057 Braga, Portugal
e-mail: [email protected]
url: http://www.cecs.uminho.pt/

A. Torres
Department of Communication and Art, University of Aveiro, 3810-193 Aveiro, Portugal
e-mail: [email protected]
url: http://www.ca.ua.pt

1 Introduction

The INSCAPE tool aims at enabling ordinary people to use and master the latest Information Society Technologies for conceiving, authoring, publishing and experiencing interactive stories. To accomplish these goals, INSCAPE is made of a central framework and a suite of applications (plug-ins) which provide the necessary authoring innovations.

The plug-in we present here has been developed in two successive phases: the first studied the expression, or expressivity, of emotion, and the second the experience of emotion. Section 2 presents the work done on the expression of emotion in virtual environments, while Section 3 discusses a cognitive model for what characters experience inside the virtual world. For expression, we present a table of correspondences we found between film and virtual reality in terms of expressivity. For the experience phase, we defined an emotional cognitive structure for the characters and a behavior model that gives virtual characters some autonomy in the world and, above all, relations with the virtual environment regulated by the preset emotional parameters.

2 The plug-in

The AE is a controller module that uses other software modules from INSCAPE to produce results. The AE's core function is to orchestrate all these modules in order to improve the emotionality of virtual world scenes, characters and interactivity. For a successful integration of the AE, we have generated a table of compatibilities between our researched parameters and the classes and methods of the INSCAPE system.


Fig. 1 The interface of the "Atmosphere Editor"

Fig. 2 The implementation process in the storytelling

As we can see in the interface (Fig. 1), the AE acts upon environments and characters through sliders, mixing different levels of expressivity. Each slider is computed according to the parameters researched for the story. In terms of how it works, the Atmosphere Editor must be seen as a module that affects the story but works hierarchically below all the other INSCAPE editors, so authors can change any transformation made by the AE. This lets users apply the AE to an overall scene and then make small adjustments to the transformations it has performed. Using a direct interface, the user is thus completely free to change the world and characters as he or she likes, and in a much more straightforward way: moving a slider to the desired percentage, seeing the effect at 100%, and deciding on the fly, in real time, to reduce it to only 25% if required. In addition, the user can mix various emotional categories to attain the expressivity he or she prefers for the scene being built.
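As an illustration of this percentage-based mixing, the sketch below blends per-category parameter offsets, scaled by each slider value, onto a scene's baseline settings. The category names and scene parameters are hypothetical and do not reflect the actual INSCAPE classes or API.

```python
# Minimal sketch of percentage-based emotional mixing, assuming hypothetical
# category presets and scene parameters (not the INSCAPE API).

# Each emotional category maps scene parameters to a full-strength (100%) offset.
CATEGORY_PRESETS = {
    "tension": {"light_contrast": +0.6, "music_tempo": +0.4, "color_saturation": -0.3},
    "relax":   {"light_contrast": -0.4, "music_tempo": -0.5, "color_saturation": +0.2},
}

def apply_sliders(scene: dict, sliders: dict) -> dict:
    """Blend each category's offsets into the scene, scaled by its slider (0-100%)."""
    result = dict(scene)
    for category, percent in sliders.items():
        weight = percent / 100.0
        for param, offset in CATEGORY_PRESETS.get(category, {}).items():
            # Clamping and unit handling are omitted; this only shows the mixing idea.
            result[param] = result.get(param, 0.0) + weight * offset
    return result

baseline = {"light_contrast": 0.5, "music_tempo": 0.5, "color_saturation": 0.5}
# 100% tension first, then the author dials it back to 25% and adds some relaxation.
print(apply_sliders(baseline, {"tension": 100}))
print(apply_sliders(baseline, {"tension": 25, "relax": 40}))
```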

In terms of storytelling, we can see in Fig. 2 that the AE plug-in is a software module designed to act on form, and more specifically on story stylistics. This is a conscious choice, made to avoid entering the contextual and thematic domains and to avoid interfering too much with the author's core message. With this approach, the author remains responsible for producing the idea, developing it, and then choosing elements and interactions according to his or her needs.

This leaves us with three variables to be affected by the AE (Environments, Characters and Interactivity) without direct interference in the author's work. For Characters, there are two further variables, Expression and Experience:

emotion experience is the subjective-feeling state consciously perceived by the individual … emotion expression means the physiological or behavioral response to an emotion stimulus [7:9]

Expression represents physiological and behavioral responses such as body, eyes, hair, movement, etc. Experience represents cognitive appraisals of the world and needs (an object, cognitive appraisals, and goals), and this defines the path we traced into the domain of autonomous characters presented in this paper. In Table 1 we can see traditional storytelling classes researched according to their emotional impact on viewers. These classes were then "remediated" [1] and implemented in virtual environments.

Most of these parameters are related to the expression of emotion; however, among the parameters for characters (our main concern in this paper), the table contains two classes that are very important for emotional experience: Touch and Personality.

Touch was considered one of the most important classes from the beginning of our studies, mainly because we found that touch between characters in visual stories can be translated into ranges of emotions that are not stimulated by the majority of interactive storytelling artefacts [11]. Virtual body touching is a form of sensory comfort that a character can seek and that the user can consequently feel through somatic simulation. Moreover, it can help the user sustain the continuity of the inactive, negative range of emotions. Body touching can be defined through different behaviors that express affection in body language (e.g. patting shoulders or hands, running the hands through hair or over the body, hugging, kissing, sitting or lying down, leaning); the implementation of virtual body touch is shown in Fig. 3.

The importance of body touching has been well highlighted since Harlow's study [4], which concluded that the comfort of body contact is a determinant of attachment formation. Since attachment is known to foster feelings of security, the presence of body touching at this stage is crucial. For the Personality class, our studies use the "Big Five" model to trace character personalities, because personality can influence the emotional response process.
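A minimal sketch of how these two character classes might be represented is given below. The class and field names are illustrative assumptions, not the INSCAPE data model; only the Big Five traits and the touch behaviors listed above come from the text.

```python
# Sketch of a character profile combining Big Five personality traits and
# affective touch behaviors; names are illustrative, not the INSCAPE classes.
from dataclasses import dataclass, field

TOUCH_BEHAVIORS = ["pat", "stroke_hair", "hug", "kiss", "sit_or_lie_down", "lean"]

@dataclass
class Personality:
    # Big Five traits, each in [0.0, 1.0]
    extraversion: float = 0.5
    agreeableness: float = 0.5
    conscientiousness: float = 0.5
    neuroticism: float = 0.5
    openness: float = 0.5

@dataclass
class CharacterProfile:
    name: str
    personality: Personality = field(default_factory=Personality)
    # Expression covers observable responses (body, face, movement, touch);
    # Experience is handled by the appraisal model described in Sect. 3.
    available_touch: list = field(default_factory=lambda: list(TOUCH_BEHAVIORS))

claire = CharacterProfile("Claire", Personality(agreeableness=0.8, neuroticism=0.2))
print(claire.personality.agreeableness, claire.available_touch[:3])
```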

3 The cognitive model

Interest in experiential models of emotion and emotional behavior has been growing steadily, mainly in the 'agent' research community [2, 5, 8]. The development of computational models of emotion facilitates advances in a large array of systems that model, interpret or influence human behavior (thorough surveys of the state of the art in computational models of emotion can be found at HUMAINE, the Human-Machine Interaction Network on Emotion, EC Network of Excellence FP6-IST-2002-507422, http://emotion-research.net). Such a system interprets the world and its interactions according to the emotion model it embodies and the phase or mood active at that moment, just as a human being facing a challenge will react in accordance with his or her personality plus the mood of the moment. For instance, the research of Mao and Gratch [6] concentrates on creating a model of the underlying process, where inferences of social causality are used to enrich the cognitive and social functionality of intelligent agents. Samyn's [8] model, in contrast, is based on maintaining characters in a pervasive world, researching a decision system that can autonomously establish goals and perform dramatic actions within virtual worlds. Such models can help virtual agents explain the observed social behavior of others, which is crucial for successful interaction among social entities.

Starting from Frijda's [3] cognitive appraisal theories, mainly his "emotion process" (see the diagram on p. 454), we developed an emotional model to regulate character behaviors, which translates into a decision system for action readiness and action determination. According to Frijda, the emotion process has a core system which begins with a stimulus and ends with a response, and this core system is regulated by seven phases. The first phase is an analyzer that codes (categorizes) events. In the second phase, the event is evaluated according to its relevance or irrelevance for eliciting pleasure, pain, surprise, or desire. In Frijda's theory, emotions are held to arise when events are appraised as relevant to concerns; a concern is defined as "a disposition to desire occurrence or non-occurrence of a given kind of situation" [3:335], and includes motives, major goals, personal sensitivities, attachments, and personal supra values. The third phase is a diagnostician that evaluates what can be done to cope with the event. The fourth is an evaluation of the event's strength, difficulty and seriousness, and is responsible for emitting a control signal with the results. The fifth phase determines the action plan, and the phase that generates the physiological change follows. The last phase, the seventh, determines the action to emit. Frijda also argues that other factors play a part in determining the emotional response; these collateral factors are mood, the activation state, previous experiences and other people. It is also important to note that the model has continuing feedback: each emotional response is influenced by the previous ones. This is the process that generates sentiments.

As we can see in Fig. 4, the model we propose is simpler than Frijda's emotional process: some of Frijda's phases are compacted into a single phase, a filter in our model. For example, our filter called "Event coding" has the functions of Frijda's first, second and fourth phases, because it is responsible for all event evaluation. Our "Resource evaluator" is equivalent to the third and fifth phases of Frijda's model, because it is in charge of evaluating what can be done to cope with the event and of planning the action accordingly. The last phase of Frijda's model is present in all our filters, because all of them contribute to the generation of the action/response. We also take Frijda's collateral factors into account.
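The correspondence between Frijda's phases and our filters described above can be summarized as in the following sketch. The labels are illustrative only; they follow the text and Fig. 4 rather than any actual INSCAPE code.

```python
# Sketch of how Frijda's seven phases are compacted into the model's filters,
# as described in the text; names are illustrative labels, not INSCAPE code.
FRIJDA_PHASES = {
    1: "event coding (analyzer)",
    2: "relevance appraisal (pleasure, pain, surprise, desire)",
    3: "diagnosis of coping options",
    4: "evaluation of strength, difficulty, seriousness (control signal)",
    5: "action-plan determination",
    6: "physiological change generation",
    7: "action emission",
}

FILTER_TO_PHASES = {
    "event_coding":       [1, 2, 4],   # responsible for all event evaluation
    "resource_evaluator": [3, 5],      # coping options and action planning
    # Phase 7 (action emission) is distributed across every filter,
    # since all of them contribute to the final response.
}

for filt, phases in FILTER_TO_PHASES.items():
    print(filt, "->", [FRIJDA_PHASES[p] for p in phases])
```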


Table 1 Classes for the implementation of environments and characters [10]

Environments:
- Camera: lenses, motion, position
- Editing: cuts and pace
- Time: continuity and variation
- Frame: composition and shape
- Screen direction: 3 axes (up-down; left-right; back-front)
- Music/sound qualities: intensity, pitch, rhythm, speed, shape
- Lighting: motivation, contrast, tone
- Color: hue, brightness, saturation
- Design effects: visual and aural

Characters:
- Character's space: intimate, personal, social, public
- Physical features: clothes, skin, hair, weight, height
- Body movement: posture, gestures
- Facial expression: face and eyes
- Touch: types
- Vocal aspects: tone, types
- Apparent personality: extraversion, agreeableness, conscientiousness, neuroticism, openness

Fig. 3 Touchability, interaction between characters (top left: Sad; top right: Happy; bottom left: Relax; bottom right: Tension)

Fig. 4 Emotion characters model, grounded in Frijda's model [3]

As for Frijda's collateral factors, we have a filter that evaluates the activation state as part of the emotional response selection; previous experiences are taken into account through the feedback system, and other people through the conversion of their responses into new stimuli. We added another factor: "stimulus duration." The duration of a stimulus influences the answer emitted by the character: if the character has a long time to cope with a given situation, s/he will choose a more adequate answer than if s/he does not. At the same time, if exposure to a given negative stimulus is longer, the character will answer less adequately, because the stimulus's negative impact will also be greater.
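The influence of stimulus duration can be illustrated with a small sketch. The paper only states the qualitative relations, so the scaling function and constants below are hypothetical.

```python
# Illustrative sketch only: the text gives the qualitative relations
# (more coping time -> more adequate answer; longer negative exposure ->
# less adequate answer) but not a concrete formula. Values are hypothetical.
def answer_adequacy(coping_time: float, exposure_time: float, valence: float) -> float:
    """Return an adequacy score in [0, 1] for the selected answer.

    coping_time:   seconds the character has to deal with the stimulus
    exposure_time: seconds the character has been exposed to the stimulus
    valence:       stimulus valence in [-1, 1]; negative values degrade adequacy
    """
    adequacy = min(1.0, coping_time / 10.0)          # more time, better answer
    if valence < 0:
        adequacy *= max(0.0, 1.0 - exposure_time / 30.0 * abs(valence))
    return adequacy

print(answer_adequacy(coping_time=8.0, exposure_time=5.0, valence=-0.7))   # higher
print(answer_adequacy(coping_time=2.0, exposure_time=25.0, valence=-0.7))  # lower
```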

The process of regulation works as follows: for each stimulus event produced in the virtual environment, characters emotionally experience the event through four procedures before responding to the environment or to other characters. The four procedures are filters of behavioral answers: several potential answers are selected by the filters, so a stimulus X will not always correspond to the same answer Y.

The first filter is related to the coding of the event: a catalogue of events in the model recognizes the stimulus and attributes a label to it (e.g. relevant/irrelevant; desirable/undesirable; easy/hard; pleasant/unpleasant; kind/unkind). The second filter corresponds to the reading of the emotion set by the story author in the plug-in, which biases the response to the stimulus in the way that a person's mood contaminates responses in everyday life [9]. The activation degree of the character therefore determines whether it is able to cope with the situation, in the same manner as happens with humans: if the activation degree is high, the character is better able to cope with the stimulus. The same author also found that human responses are more adequate when the degree of stress is lower; likewise, the capacity to cope is improved by the positive valence of the emotional state. The third filter is the evaluator of external (environment elements) and internal (activated states) resources for coping with the event. This evaluator determines which of the previously selected answers will drive the action; it is a balancing phase that reviews the previous phases to obtain the most probable answer. The fourth filter is related to the duration of exposure to the event and helps to determine the answer: if the time to cope with the stimulus is short, the character cannot perform the most adequate answer, only the possible one, and vice versa. Finally, there is a "learning" mechanism: the effects of each response performed are memorized to be used in subsequent situations, as shown by the dashed line in the diagram. This is a feedback mechanism similar to Frijda's, as described above, and it is responsible for updating the whole system by evaluating the consequences of the character's answers. The positive or negative consequences of those answers teach the character system which answers are most adequate to a particular kind of stimulus (answers with positive consequences are reinforced and those with negative consequences are discouraged). This mechanism helps characters improve the adequacy of their emotional behavior over their lifespan in the virtual world.
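As an illustration, the following sketch strings the four filters and the feedback mechanism together. It is a minimal sketch under assumed names, catalogues and thresholds; it is not the INSCAPE implementation.

```python
# Minimal sketch of the four-filter regulation pipeline with learning feedback,
# following the description in the text; all names and values are illustrative.
import random

class EmotionRegulator:
    def __init__(self, author_emotion: str, activation: float):
        self.author_emotion = author_emotion      # mood set by the story author (filter 2)
        self.activation = activation              # activation degree in [0, 1]
        self.memory = {}                          # (label, answer) -> reinforcement score

    def event_coding(self, stimulus: str) -> str:
        """Filter 1: label the stimulus from a catalogue of events (simplified)."""
        catalogue = {"threat": "unpleasant", "gift": "pleasant", "greeting": "kind"}
        return catalogue.get(stimulus, "irrelevant")

    def candidate_answers(self, label: str) -> list:
        """Filter 2: the author-set emotion (mood) and activation bias the candidate set."""
        base = {"unpleasant": ["flee", "freeze", "confront"],
                "pleasant": ["approach", "hug"],
                "kind": ["greet_back", "smile"],
                "irrelevant": ["ignore"]}[label]
        # A negative authored mood lowers the effective activation, as mood
        # contaminates responses; high activation keeps more coping options open.
        effective = self.activation * (0.5 if self.author_emotion in ("tension", "sad") else 1.0)
        keep = max(1, round(len(base) * effective))
        return base[:keep]

    def resource_evaluation(self, answers: list, label: str) -> list:
        """Filter 3: rank answers by memorized consequences (internal resources)."""
        return sorted(answers, key=lambda a: self.memory.get((label, a), 0.0), reverse=True)

    def duration_filter(self, answers: list, coping_time: float) -> str:
        """Filter 4: with little time, settle for a possible answer, not the best one."""
        return answers[0] if coping_time >= 5.0 else random.choice(answers)

    def respond(self, stimulus: str, coping_time: float) -> str:
        label = self.event_coding(stimulus)
        return self.duration_filter(
            self.resource_evaluation(self.candidate_answers(label), label), coping_time)

    def feedback(self, stimulus: str, answer: str, outcome: float) -> None:
        """Learning: reinforce answers with positive consequences, discourage negatives."""
        label = self.event_coding(stimulus)
        self.memory[(label, answer)] = self.memory.get((label, answer), 0.0) + outcome

reg = EmotionRegulator(author_emotion="tension", activation=0.8)
answer = reg.respond("threat", coping_time=6.0)
reg.feedback("threat", answer, outcome=+1.0)   # the same stimulus may later get a different answer
print(answer)
```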

4 Conclusions

As we have explained in this paper, we have implemented the first phase of a software tool that helps virtual storytelling authors perform quick expressive transformations of environments and characters. We then presented the theoretical model we are designing, to be implemented in line with existing psychological theories. Our main interest is the creation of a model, or architecture, of emotions that can help autonomous characters cope with the environment and with other characters. Above all, we are concerned with developing a system that can use the expressivity of touch between characters to respond emotionally and so broaden the spectrum of emotions elicited by virtual storytelling artefacts. It is therefore not our aim to develop a highly realistic or life-simulating model. In this plug-in, our main goal is to implement an emotion system that can use the best techniques developed by traditional storytelling artefacts, which employ dramatic techniques to elicit emotion. We believe that even if a virtual world can be more extensive or pervasive than any traditional story environment, it is still not a physical reality. Thus we should not worry about simulating reality but only about imitating it, as Aristotle said.

Acknowledgements This research is performed in the framework of the INSCAPE Integrated Project (EU RTD contract IST-2004-004150), which is funded by the European Commission under the Sixth Framework Programme. More information on the project is available at www.inscapers.com.


References

1. Bolter, J.D., Grusin, R.A.: Remediation: Understanding New Media. MIT Press, Cambridge (1999)

2. Eladhari, M., Lindley, C.: Player character design facilitating emotional depth in MMORPGs. In: Digital Games Research Conference 2003, 4–6 November 2003. University of Utrecht, The Netherlands (2003)

3. Frijda, N.H.: The Emotions. Cambridge University Press, Cambridge (1986)

4. Harlow, H.F.: The nature of love. Am. Psychol. 13, 673–685 (1958). Available at http://psychclassics.yorku.ca/Harlow/love.htm

5. Klesen, M.: Using theatrical concepts for role-plays with educational agents. Applied Artificial Intelligence, Special Issue on Educational Agents: Beyond Virtual Tutors (2005)

6. Mao, W., Gratch, J.: Social causality and responsibility: modeling and evaluation. In: International Conference on Intelligent Virtual Agents, Kos, Greece (2005)

7. Plantinga, C.: Introduction. In: Plantinga, C.R., Smith, G.M. (eds.) Passionate Views: Film, Cognition, and Emotion. Johns Hopkins University Press, Baltimore (1999)

8. Samyn, M.: Drama Princess project. In: Tale of Tales. http://www.tale-of-tales.com/DramaPrincess (2006)

9. Sanders, A.F.: Towards a model of stress and human performance. Acta Psychol. 53, 61–97 (1983)

10. Zagalo, N.: Convergência entre o Cinema e a Realidade Virtual (Convergence between Cinema and Virtual Reality). PhD Thesis, Departamento de Comunicação e Arte, Universidade de Aveiro, Portugal (2007). http://biblioteca.sinbad.ua.pt/teses/2008001260

11. Zagalo, N., Torres, A., Branco, V.: Passive interactivity, an answer to interactive emotion. In: 5th International Conference on Entertainment Computing. Lecture Notes in Computer Science, vol. 4161. Springer, New York (2006). ISBN 3-540-45259-1

Nelson Zagalo is an assistant professor currently working at the University of Minho, Portugal. His PhD, "Convergence between Cinema and Virtual Reality", presented new interaction paradigms for affective storytelling communication in virtual environments. He has more than twenty peer-reviewed publications in the fields of film, videogames, interactive storytelling and emotion. During the last three years he was a member of the Scientific Board of the INSCAPE project (FP6, IP, IST-2004-04150) and was responsible for developing emotion semantic layers to help ease the authoring of virtual environments. He now teaches courses on film, digital storytelling, digital games and digital arts, and researches new communication models for interactive media such as interactive advertising and virtual worlds.

Ana Torres is a psychologist and researcher at the Research Unit of Communication and Arts of the University of Aveiro. She is developing emotionally meaningful digital environments and characters. She is also finishing her master's degree in Psychiatry and Mental Health at the Faculty of Medical Sciences of the University of Oporto. Her master's research concerns the cognitive-stimulation potential of videogames for elderly people.