Research Article

The Study of a Five-Dimensional Emotional Model for Facial Emotion Recognition

Hanzhong Zhang, Jibin Yin, and Xiangliang Zhang

Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650221, China

Correspondence should be addressed to Jibin Yin; yjblovelh@aliyun.com

Received 25 April 2020; Revised 6 October 2020; Accepted 18 December 2020; Published 28 December 2020

Academic Editor: Francesco Gringoli

Mobile Information Systems, Volume 2020, Article ID 8860608, 10 pages. https://doi.org/10.1155/2020/8860608

Copyright © 2020 Hanzhong Zhang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Based on basic emotion theory and the PAD emotion model that can describe continuous emotion changes, we first propose a more general concept of a five-dimensional emotion model to better meet the needs in the area of emotion recognition. We determined the relationship between its dimensions and basic emotions and used a Pearson correlation analysis, a multilayer perceptron, and other methods to compare and verify it with volunteer human identifiers. The results demonstrated that the five-dimensional emotion model was better than human identification in the field of emotion recognition. We also compared it with the PAD emotion model. The results demonstrated that the five-dimensional emotion model performed better. Finally, using the proposed model, we designed a technology prototype of a mood-adaptive interface to demonstrate its potential application.

1. Introduction

The development of artificial intelligence has reached a very high level in the areas of reasoning, calculation, recognition, and learning. However, it only simulates human thinking and rarely pays attention to people's actual mental activity. Research on artificial intelligence combined with human psychological changes in the field of emotion recognition is only just beginning.

In current research, different psychological schools have different understandings of emotion. The psychological understanding of emotion can be roughly divided into basic emotion theories and dimensional emotion theories. Functional psychology believes that mental activity is a continuous whole; it opposes the splitting of a person's mental activity into parts [1]. Dimensional emotion theories are more suitable for expressing a continuous change of emotion.

In the study of dimensional emotion theories over the past decades, psychologists have proposed various models. The PAD emotion model (or simply PAD, from its three dimensions) describes emotions in terms of a composition of pleasure–displeasure, arousal–nonarousal, and dominance–submissiveness [2–4]. Frijda proposed that emotions are a mixture of pleasant–unpleasant, excited, interested, social evaluation, surprise, and simple–complex [5]. Izard proposed that emotions are composed of a pleasant, tension, impulse, and confidence dimension [6]. These emotion models have been applied in various situations and contexts, but the existing emotion models do not necessarily work best in emerging fields. In the area of emotion recognition, many computer science studies that use existing emotion models could be improved upon in terms of their results in order to be applicable in various environments.

In current AI research using existing emotion models, Wang used the OCC model to composite the cognitive components of the agent's emotions [7]; Han used the AVS emotion model to design the cognitive-emotional model of an eldercare robot [8]; Mi used the arousal-valence two-dimensional emotion space to explore the academic emotions of secondary school students [9].

However, there is not much research on the design of emotion models adapted to the field of artificial intelligence. Shangfei Wang believed that the name of the emotion dimension was unimportant, and the use of dimensions implied that an emotion space can be established using appropriate methods, in which each emotion can be seen as a vector of that space [10]. Fu et al. [11] and Cai [12] used a neural network to directly train data features to obtain a model. On the basis of a series of experimental studies [13–15], Gable and Harmon-Jones first directly tested the hypothesis that motivation-related factors of emotion may affect cognitive processing such as attention and memory, and they proposed the motivational dimensional model of affect [16]. Many of these studies have not been adapted well to the field of emotion recognition, and this is why we propose the five-dimensional emotion model.

2. Five-Dimensional Emotion Model

In view of the shortcomings of those existing models for our purposes, we deduced and defined five dimensions related to emotion, then built a five-dimensional emotion model and evaluated it.

2.1. Determine Dimensions. In 1971, Ekman developed the Facial Affect Scoring Technique (FAST) [17], which is an emotional measurement system based on facial muscle movement components. According to this system, he listed six basic emotions: happiness, anger, fear, sadness, surprise, and disgust. On the premise that this emotion theory provided an objective truth, there must be different emotional dimension values that could be assigned to different basic emotions. Accordingly, we established a method in reverse, which determines basic emotions from inferred dimensions.

Because "disgust" can be understood as an evaluation of external objects or stimuli, which was not conducive to determining the dimensions, it was temporarily excluded from the derivation. Based on two existing emotion models, we first assumed two known dimensions: intensity and pleasure. Intensity is used to describe the degree of individual neural activation; pleasure is used to distinguish the positive and negative aspects of emotions. These dimensions are internal experiences belonging to an individual's own feelings, so they can be grouped as internal dimensions. Intensity is not a determinant of emotion; it is only used to describe the strength of an emotion. Pleasure distinguishes happiness from the other five basic emotions and makes happiness exist as a positive emotion.

In 1985, Smith and Ellsworth determined six dimensions of emotion through experimentation: certainty, pleasure, attentional activity, control, expected effort, and sense of responsibility. They then proposed a cognitive evaluation theory of emotion [18]. According to this theory, the core dimensions that distinguish anger from other negative emotions are certainty, control, and responsibility. Anger is an emotion that often occurs when something negative happens to the individual and others were responsible and/or have control; in addition, the individual has a fair sense of certainty about the event. Fear is an emotion that appears often when the environment controls negative events; with it, individuals have a high sense of uncertainty about the negative objective events. In Kalat and Shiota's study of emotion, they pointed out that, under the same conditions, the difference between anger and sadness is whether people can control the situation [19]. So we can determine two dimensions to distinguish negative emotions: control and certainty.

Surprise and fear are similar in a large number of cases. In a 1985 study, Meng [20] determined the facial expression patterns of infants. They asked strangers to hold rats in front of infants to cause fear and presented rats that ran freely in a box to cause surprise in the infants. They obtained the predicted correct results in this experiment. It can be seen from this comparison that the key variable determining fear or surprise was whether the infants increased their response intensity to the external stimuli, namely, tension. From this, we can determine the dimension that distinguishes surprise from fear: tension.

Because the three dimensions described above are related to cognition and they lead to emotional differences, we can categorize them as external dimensions.

To reduce the potential error caused by the failure of individuals to analyze emotions well, we employed 65 people to evaluate the relationships between the basic emotions and the five dimensions at six levels: very high (100), high (75), medium (50), low (25), very low (0), and no correlation (empty). In addition to the given levels, some people thought that the difference between these levels was too large and thus filled in other values, such as 30. This was not expected to have an impact on the experimental results.

In order to reduce the error, we deleted outliers (defined as data with very large deviations). Based on this, we considered a relationship with too high a standard deviation and too large a dispersion to be irrelevant for our use, because different individuals were expressing different feelings. The standard deviations between the five basic emotions and the five dimensions are shown in Table 1.

Finally, we calculated the average of the remaining data and determined the corresponding level relationships between the five dimensions and the basic emotions. The relationship is shown in Table 2.
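The aggregation just described can be summarized in a short script. The sketch below assumes a simple z-score rule for outlier deletion and a fixed dispersion cut-off for declaring a relationship irrelevant, since the paper does not give exact thresholds; the example ratings are simulated.

```python
# Hedged sketch of the rating aggregation for one (emotion, dimension) pair.
# The z_cut and std_cut thresholds are assumptions for illustration.
import numpy as np

LEVELS = np.array([0, 25, 50, 75, 100])   # very low ... very high

def aggregate_ratings(ratings, z_cut=2.0, std_cut=40.0):
    x = np.asarray([r for r in ratings if r is not None], dtype=float)  # None = "no correlation"
    if x.size == 0:
        return None                        # everyone marked the pair irrelevant
    if x.std() > 0:                        # drop outliers far from the mean
        x = x[np.abs(x - x.mean()) <= z_cut * x.std()]
    if x.std() > std_cut:                  # too dispersed -> treated as irrelevant
        return None
    return int(LEVELS[np.argmin(np.abs(LEVELS - x.mean()))])  # snap mean to nearest level

ratings = [75, 100, 75, 50, 100, None, 75, 30] * 8 + [100]     # 65 simulated answers
print(aggregate_ratings(ratings))
```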

2.2. Verify Dimensions. Izard points out that emotion plays a key role in adaptation and survival; emotions have adaptability and mobility. The evidence for this conclusion is that, in a study of comparative anatomy and comparative psychology, it was confirmed that facial activities play an important role in the lives of some species of vertebrates; this emphasizes the role of facial expression in emotion research [21]. On this basis, Meng Zhaolan proposed that facial expression can be used as a basic means of determining emotions in research [22].

Therefore, we can verify the five dimensions of our model through the study of facial motion or expression.

Ekman and other researchers built the Facial Action Coding System (FACS) in the mid-1970s by studying their own facial expressions [23]. This system is not based on muscles but on facial activity, namely, the activity unit (AU). Each AU is associated with one or more muscles. The AUs we used are shown in Table 3.


An individual can generally judge another's emotions by a quick scan of their expressions. In this case, the visual image of the face is very hazy, and many AUs cannot be recognized clearly. This would suggest that there is no need for so many AUs in judging emotions. So we could simplify the table of activity units and merge the 24 AUs into five simple AUs: the movement of the eyebrows up and down, the distance between the eyebrows, the closure of the eyes, the curvature of the mouth corners, and the closure of the mouth.

By using the Karolinska Directed Emotional Faces (KDEF) [24], a dataset of facial expressions and corresponding emotions provided by the Karolinska Institute, to analyze the simple AUs, we could verify the accuracy of our model.

Using the Face++ engine, we extracted 20 feature points from the KDEF expressions (see Figure 1).

Then, we calculated the coordinates of these feature points in order to get the quantized values of each sample AU.

For the closure of the mouth, the following equations were used:

\[
\begin{aligned}
\text{mouth\_reference} &= \text{``distance from the top of the lower lip to the chin''},\\
\text{mouth\_height} &= \text{``distance from the top of the lower lip to the top of the upper lip''},\\
\text{mouth\_open} &= \frac{\text{mouth\_height}}{\text{mouth\_reference}} \times 100.
\end{aligned}
\tag{1}
\]

The reason for introducing mouth_reference here is that it can eliminate the influence of the camera's distance and proximity on the coordinate values by finding a reference system within the face. The other similar operations are the same.

For the curvature of the mouth corners, the following equations were used:

\[
\begin{aligned}
\text{lip\_up} &= \text{``y-axis position of the upper lip''},\\
\text{lip\_down} &= \text{``y-axis position of the lower lip''},\\
\text{lip\_center} &= \text{``average of the y-axis positions of the mouth corners on both sides''},\\
\text{lip\_up0} &= |\text{lip\_up} - \text{lip\_center}|,\\
\text{lip\_down0} &= |\text{lip\_down} - \text{lip\_center}|,\\
\text{mouth\_rad} &= \frac{\text{lip\_down0}}{\text{lip\_up0} + \text{lip\_down0}} \times 100.
\end{aligned}
\tag{2}
\]

Table 1: The standard deviation between the five basic emotions and the five dimensions.

            Intensity   Pleasure   Control   Certainty   Tension
Happiness   6288        8528       6102      5225        3176
Anger       7103        2977       6008      5422        5827
Sadness     5953        2820       4592      5546        4538
Fear        6798        2601       407       4204        7184
Surprise    6059        3938       5237      4909        3735

Table 2: The level relationship between the five basic emotions and the five dimensions.

            Intensity   Pleasure   Control   Certainty   Tension
Happiness   -↑          ↑          -↑        -           -↓
Anger       ↑           ↓          -↑        -           -↑
Sadness     -↑          ↓          -         -           -
Fear        -↑          ↓          -↓        -↓          ↑
Surprise    -↑          -↓         -         -           -↓

↑ stands for a very high value, ↓ for a very low value, -↑ for a high value, -↓ for a low value, - for a medium value, and ∅ for irrelevant.


For the closure of the eyes, the following equations were used:

\[
\begin{aligned}
\text{left\_eye\_width} &= \text{``left eye width''},\\
\text{left\_eye\_height} &= \text{``distance from the bottom of the left eyelid to the top''},\\
\text{left\_eye\_open} &= \frac{\text{left\_eye\_height}}{\text{left\_eye\_width}},\\
\text{right\_eye\_width} &= \text{``right eye width''},\\
\text{right\_eye\_height} &= \text{``distance from the bottom of the right eyelid to the top''},\\
\text{right\_eye\_open} &= \frac{\text{right\_eye\_height}}{\text{right\_eye\_width}},\\
\text{eyes\_open} &= (\text{left\_eye\_open} + \text{right\_eye\_open}) \times 100.
\end{aligned}
\tag{3}
\]

For the movement of the eyebrows up and down and the distance between the eyebrows, the following equations were used:

\[
\begin{aligned}
\text{left\_eyebrow\_width} &= \text{``left eyebrow width''},\\
\text{right\_eyebrow\_width} &= \text{``right eyebrow width''},\\
\text{eyebrows\_center} &= \text{``y-axis position of the point between the eyebrows''},\\
\text{eyebrows\_distance} &= \left(\frac{\text{eyebrows\_center} \times 2}{\text{left\_eyebrow\_width} + \text{right\_eyebrow\_width}} - 0.7\right) \times 500.
\end{aligned}
\tag{4}
\]

Table 3: Single activity unit (AU) of FACS (excerpt from P. Ekman and W. V. Friesen, "Measuring Facial Movement," Environmental Psychology and Nonverbal Behavior, vol. 1, pp. 56–75, 1976).

AU number   Description                                  Facial muscle
1           Inner brow raiser                            Frontalis (pars medialis)
2           Outer brow raiser (unilateral, right side)   Frontalis (pars lateralis)
4           Brow lowerer                                 Depressor glabellae, depressor supercilii, and corrugator supercilii
5           Upper lid raiser                             Levator palpebrae superioris
6           Cheek raiser                                 Orbicularis oculi (pars orbitalis)
7           Lid tightener                                Orbicularis oculi (pars palpebralis)
10          Upper lip raiser                             Levator labii superioris and caput infraorbitalis
12          Lip corner puller                            Zygomatic major
14          Dimpler                                      Buccinator
15          Lip corner depressor                         Depressor anguli oris (triangularis)
16          Lower lip depressor                          Depressor labii inferioris
25          Lips part                                    Depressor labii, or relaxation of mentalis (AU 17), or orbicularis oris
26          Jaw drop                                     Masseter; temporalis and internal pterygoid relaxed
27          Mouth stretch                                Pterygoids and digastric
28          Lip suck                                     Orbicularis oris


The operation here is that we used the data of individuals with extreme results to calculate the maximum and minimum values of the distance between the eyebrows and then mapped this value to the range of 0–100. The following similar operations are the same as this one.

\[
\begin{aligned}
\text{left\_eyebrow\_height} &= \text{``distance from the right contour of the left eyebrow to the left eye''},\\
\text{right\_eyebrow\_height} &= \text{``distance from the left contour of the right eyebrow to the right eye''},\\
\text{eyebrows\_height} &= \left(\frac{\text{left\_eyebrow\_height} + \text{right\_eyebrow\_height}}{\text{left\_eyebrow\_width} + \text{right\_eyebrow\_width}} - 0.4\right) \times 250.
\end{aligned}
\tag{5}
\]

In this way, we determined the values for the sample AUs from the expression pictures needed to calculate the five-dimensional values: the closure of the mouth corresponds to mouth_open, the curvature of the mouth corners corresponds to mouth_rad, the closure of the eyes corresponds to eyes_open, the distance between the eyebrows corresponds to eyebrows_distance, and the movement of the eyebrows up and down corresponds to eyebrows_height.
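Equations (1) through (5) can be collected into a single routine that turns raw landmark coordinates into the five sample AUs. The sketch below is illustrative only: the landmark names in the pts dictionary are placeholders of our own choosing (the paper obtains the 20 feature points with the Face++ engine), while the constants 0.7, 500, 0.4, and 250 come directly from equations (4) and (5).

```python
# Illustrative computation of the five sample AUs from 2D facial landmarks.
# The keys of `pts` are hypothetical names for the needed points; any landmark
# detector that supplies these coordinates would work.

def sample_aus(pts):
    """pts: dict mapping landmark name -> (x, y), with y increasing downward."""
    def dist(a, b):
        (ax, ay), (bx, by) = pts[a], pts[b]
        return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

    # (1) closure of the mouth
    mouth_open = dist("lower_lip_top", "upper_lip_top") / dist("lower_lip_top", "chin") * 100

    # (2) curvature of the mouth corners
    lip_up, lip_down = pts["upper_lip_top"][1], pts["lower_lip_bottom"][1]
    lip_center = (pts["mouth_corner_left"][1] + pts["mouth_corner_right"][1]) / 2
    lip_up0, lip_down0 = abs(lip_up - lip_center), abs(lip_down - lip_center)
    mouth_rad = lip_down0 / (lip_up0 + lip_down0) * 100

    # (3) closure of the eyes
    left_open = dist("left_eye_top", "left_eye_bottom") / dist("left_eye_left", "left_eye_right")
    right_open = dist("right_eye_top", "right_eye_bottom") / dist("right_eye_left", "right_eye_right")
    eyes_open = (left_open + right_open) * 100

    # (4) distance between the eyebrows
    brow_w = dist("left_brow_left", "left_brow_right") + dist("right_brow_left", "right_brow_right")
    eyebrows_distance = (pts["between_brows"][1] * 2 / brow_w - 0.7) * 500

    # (5) movement of the eyebrows up and down
    brow_h = dist("left_brow_right", "left_eye_top") + dist("right_brow_left", "right_eye_top")
    eyebrows_height = (brow_h / brow_w - 0.4) * 250

    return {"mouth_open": mouth_open, "mouth_rad": mouth_rad, "eyes_open": eyes_open,
            "eyebrows_distance": eyebrows_distance, "eyebrows_height": eyebrows_height}
```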

2.2.1. Correlation Analysis. We conducted a Pearson correlation analysis on the five variables obtained. The overall correlation coefficient in a population is represented by ρ. Generally, ρ is unknown, so we needed to estimate ρ approximately through a sample correlation coefficient r. Because of the different values within the individual samples, the values of r varied; therefore, we needed to examine the suitability of the sample correlation coefficient and carry out a significance test on it. Using SPSS, we calculated the sample AUs from the KDEF database with the formulas described above and then conducted the Pearson correlation analysis with the five variables (i.e., the dimensions of the corresponding emotions obtained). After that, we used min-max normalization, which is a commonly used normalization method, and calculated the correlation coefficients of the five simple AUs. They could then be used as feature weights when we calculated the five-dimensional values. The comparison results of the correlation coefficients of the five sample AUs are shown in Table 4.

We calculated the corresponding five-dimensional values of the images from the KDEF database by applying the obtained coefficients.
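One way to read this step: correlate each sample AU with each dimension's target value across the dataset, min-max normalize the coefficients within each dimension (which is why every row of Table 4 contains a 0.000 and a 1.000), and use the result as feature weights. The sketch below is a hedged interpretation; the array names are ours, the dimension targets are taken to be the level values assigned to each image's labeled emotion, and the weighted-sum combination in dimension_values is an assumption about how the coefficients are applied.

```python
# Hedged sketch: Pearson correlation of sample AUs with dimension targets,
# min-max normalization of the coefficients, and weight application.
import numpy as np

def feature_weights(au_values, dim_targets):
    """au_values: (n_images, 5) sample AUs; dim_targets: (n_images, 5) dimension values."""
    corr = np.array([[np.corrcoef(au_values[:, j], dim_targets[:, i])[0, 1]
                      for j in range(au_values.shape[1])]
                     for i in range(dim_targets.shape[1])])        # (5 dims, 5 AUs)
    lo = corr.min(axis=1, keepdims=True)
    hi = corr.max(axis=1, keepdims=True)
    return (corr - lo) / (hi - lo)                                  # each row rescaled to 0..1

def dimension_values(au_values, weights):
    """Weighted combination of the sample AUs -> five-dimensional values per image."""
    return au_values @ weights.T
```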

2.2.2. Multilayer Perceptron. A multilayer perceptron can solve most linearly nonseparable problems. In this part of the study, we used a multilayer perceptron composed of five layers of neurons. This network included an input layer, hidden layers, and an output layer, and it used gradient clipping and dropout methods for regularization.

In the multilayer perceptron, we used a softmax activation function and a cross-entropy loss function. We considered the five-dimensional data of an expression picture as a group and therefore had 980 groups of data. We divided this data into 880 groups for training, and the remaining 100 groups were used as test data.

Figure 1: Facial feature points of the KDEF expressions.
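A sketch of a classifier matching this description is given below. The paper fixes the depth (five layers of neurons), dropout, gradient clipping, the softmax output, and the cross-entropy loss; the framework (PyTorch), the hidden-layer width, the optimizer, and the learning rate are our assumptions.

```python
# Hedged sketch: a five-layer perceptron mapping the five-dimensional values of
# one expression picture to an emotion class.
import torch
from torch import nn

class FiveDimMLP(nn.Module):
    def __init__(self, n_classes=7, hidden=32, p_drop=0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(5, hidden), nn.ReLU(), nn.Dropout(p_drop),    # input layer -> hidden 1
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, n_classes),                            # logits; softmax lives in the loss
        )

    def forward(self, x):
        return self.net(x)

def train_step(model, optimizer, x, y, clip=1.0):
    """One step with cross-entropy loss and gradient clipping."""
    loss = nn.functional.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), clip)
    optimizer.step()
    return loss.item()

# usage: x is a (batch, 5) tensor of dimension values, y a (batch,) tensor of labels
model = FiveDimMLP()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
```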

After 5,180,104 iterations of training, the accuracy achieved by the model is shown in Table 5.

In order to verify the accuracy of the model, we also invited 12 volunteers of different ages to identify the same 100 randomly selected facial images and calculated their accuracy. At the same time, we let the model identify the 100 facial images. The final results are shown in Table 6.

The results show that the recognition accuracy of the model after training was higher than that of the human volunteers.

3. Comparison between the Five-Dimensional Emotional Model and the PAD Emotional Evaluation Model

3.1. PAD Emotional Evaluation Model. The PAD emotional evaluation model is a model based on emotional dimension theory. According to the evaluation method of semantic differences, Mehrabian and Russell defined the three dimensions of emotion as "P" for pleasure–displeasure, "A" for arousal–nonarousal, and "D" for dominance–submissiveness [2–4]. Among them, P represents the positive or negative degree of an individual's emotional state; A represents the physiological activation level of an individual's nerves, indicating that the individual is excited or depressed; and D represents the subjective control state, an active or passive state of the individual toward the situation and others.

The PAD emotional evaluation model, as the most famous model in emotional dimension theory, has been widely used in emotional research. In this part of the study, we tested the applicability of the five-dimensional emotion model by comparing it with the PAD emotional model.

3.2. Contrast Experiment of Accuracy under the Same Conditions. For the PAD emotional model, we used the same sample and a multilayer perceptron with the same input layer and hidden layers as for the five-dimensional emotional model for training.

Lee et al. at the Institute of Psychology, Chinese Academy of Sciences, compiled the Chinese version of the PAD emotion scales and evaluated the PAD values of 14 basic emotions; this included basic emotions proposed by different researchers, such as happiness, anger, relaxation, and fear [25]. In the current research, we used the PAD values corresponding to seven basic emotions from the KDEF database to analyze the correlation of the results. The PAD values of the seven emotions are shown in Table 7.

We analyzed the Pearson correlations between the PAD values of the emotions and the five sample AUs and took the min-max normalized coefficients as the feature weights of the PAD emotional evaluation model. The coefficients obtained are shown in Table 8.

Based on these parameters, we transformed the sample AUs of the expression images of the KDEF database into the corresponding PAD values and then trained the multilayer perceptron with the data. After 6,000,000 iterations of training, we were able to compare the accuracies of the PAD emotional model and the five-dimensional emotion model. The final results are shown in Table 9.
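As a sketch of that transformation, the Table 8 coefficients can be applied to one image's sample AUs as a weighted sum; the combination rule is our assumption, since the paper only states that the coefficients are used as feature weights, and the example AU values are made up.

```python
# The Table 8 weights applied to one image's sample AUs to obtain PAD features
# (weighted sum assumed, mirroring the Section 2 pipeline).
import numpy as np

PAD_WEIGHTS = np.array([                       # rows: P, A, D
    [1.000, 0.968, 0.488, 0.190, 0.000],       # columns: mouth_open, mouth_rad,
    [1.000, 0.000, 0.293, 0.475, 0.150],       # eyes_open, eyebrows_height,
    [0.930, 1.000, 0.000, 0.254, 0.635],       # eyebrows_distance
])

def pad_values(aus):
    """aus: the five sample AUs for one image -> (P, A, D) features."""
    return PAD_WEIGHTS @ np.asarray(aus, dtype=float)

print(pad_values([62.0, 48.0, 35.0, 20.0, 55.0]))   # illustrative input
```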

Based on this result, we concluded that the five-dimensional emotion model was better than the PAD emotional evaluation model when applied to emotion recognition. The comparison results are shown graphically in Figure 2.

Table 4: Coefficients of the sample AUs corresponding to the five dimensions.

            mouth_open   mouth_rad   eyes_open   eyebrows_height   eyebrows_distance
Intensity   1.000        0.657       0.000       0.569             0.341
Pleasure    0.325        1.000       0.156       0.000             0.346
Control     0.195        1.000       0.169       0.000             0.618
Certainty   0.000        0.733       0.609       0.224             1.000
Tension     0.648        0.000       0.755       1.000             0.163

Table 5: Model accuracy.

Whole data   Training set sampling data   Test data
0.78         0.79                         0.73

Table 6: Model and volunteers' recognition accuracy.

           Model   X1     X2     X3     X4     X5     X6     X7     X8     X9     X10    X11    X12    X̄
Accuracy   0.74    0.54   0.64   0.66   0.61   0.58   0.68   0.67   0.69   0.67   0.66   0.61   0.68   0.64

X1–X12 denote the 12 volunteers; X̄ is their mean accuracy.


4. Comparison between the Five-Dimensional Emotional Model and the Motivational Dimensional Model

The PAD model proposes the existence of a dominance dimension in addition to the valence and arousal dimensions. Lee [25] evaluated the values of the basic emotions on the PAD dimensions. Since the three dimensions of the PAD model are independent of each other, we can eliminate the dominance dimension and keep only the values of pleasure and arousal for the valence and arousal dimensions.

The motivational dimensional model of affect points to the existence of a motivational dimension in addition to the valence and arousal dimensions. Xia [26] designed experiments to assess the motivational dimension of basic emotions on a scale of 1–9. By normalizing his experimental results together with Lee's, we can map them to a range of −4 to 4 and thus obtain a numerical representation of the basic emotions in the motivational dimensional model, which is shown in Table 10.
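The paper does not spell out the normalization. One simple possibility, assuming a direct linear rescaling of the 1–9 motivation ratings onto the −4 to 4 range used for the PAD values, is

\[
m' = \frac{m - 1}{9 - 1}\bigl(4 - (-4)\bigr) - 4 = m - 5 .
\]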

We analyzed the Pearson correlations between the motivational dimension values and the five sample AUs and took the min-max normalized coefficients as the feature weights of the motivational dimensional model. The coefficients obtained are shown in Table 11.

Based on these parameters, we transformed the sample AUs of the expression images of the KDEF database into the corresponding motivational dimension values and then trained the multilayer perceptron with the data. After 2,742,366 iterations of training, we were able to compare the accuracy of the motivational dimensional model and the five-dimensional emotion model. The final results are shown in Table 12.

5. Application of the Five-Dimensional Emotion Model

Human emotion has a wide range of application prospects and potential economic value in man-machine interaction, electronic education, robotics, entertainment, the medical field, and other areas, so it has attracted a great deal of attention in academia and industry. In both arenas, a large number of related projects about human emotion have been carried out. Some of the companies that have been interested and involved in this research are IBM in the United States, Sony in Japan, Philips in Europe, and many others. China has gradually attached importance to this kind of research, where it has been explored at the Beijing University of Science and Technology, the Institute of Automation of the Chinese Academy of Sciences, and others.

Table 7: PAD values (average values) corresponding to seven emotions.

            P        A        D
Happiness   2.77     1.21     1.42
Anger       −1.98    1.10     0.60
Sad         −0.89    0.17     −0.70
Fear        −0.93    1.30     −0.64
Surprise    1.72     1.71     0.22
Disgust     −1.80    0.40     0.67
Calm        −0.53    −1.25    −0.84

Table 8: Coefficients of the PAD values corresponding to the sample AUs.

     mouth_open   mouth_rad   eyes_open   eyebrows_height   eyebrows_distance
P    1.000        0.968       0.488       0.190             0.000
A    1.000        0.000       0.293       0.475             0.150
D    0.930        1.000       0.000       0.254             0.635

Table 9: Comparison of accuracy between PAD and the five-dimensional emotion model.

                             PAD     Five-dimensional emotion model
Whole data                   0.69    0.78
Training set sampling data   0.66    0.79
Test data                    0.70    0.73

Figure 2: Comparison of accuracy between the PAD model and the five-dimensional emotion model (accuracy on the whole data, the training set sampling data, and the test data, together with the difference between the two models).


In a few words, research on the five-dimensional emotion model as applied in emotion recognition has far-reaching scientific significance and extensive application value. To this end, we took the adaptive man-machine interactive system based on the five-dimensional emotion model as an example to illustrate its practical application.

Man-machine interaction is a subject that studies the design, evaluation, and implementation of interactive computing systems for the convenience of human users. An adaptive interface is the mainstream of man-machine interaction research and application. Now, with the development of artificial intelligence, emotion is more and more important in the research of adaptive interfaces. Through the introduction of the five-dimensional emotional model, the man-machine interactive interface can observe a user's emotions so as to provide a more user-friendly experience.

For a general user interface, color and background music are part of the core content. Based on this, we designed the style of our user interface to change when users were experiencing different emotions.

Before we designed it, we needed to make some adaptive improvements to the five-dimensional emotion model. Because different people have different facial features, behavioral habits, and uncontrollable expression changes, recognition errors will occur if the model works according to only one frame of expression while performing a continuous emotional judgment. We therefore improved the model by preparing a queue of length N, each element of which was the final result of one emotion detection. When the queue was full, the model would perform a dequeue operation before each enqueue. From this, we would receive a user's N most recent continuous emotion detection results. Because emotion is a continuously changing process, this method would produce more accurate results by analyzing the emotions continuously.
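A minimal sketch of this smoothing step is shown below, assuming the per-frame detector returns the five dimension values as a tuple; the queue length N and the use of a running average follow the description above and the paragraph after Table 9, while the class and variable names are ours.

```python
# Sliding window over per-frame emotion detections: keep the last N results and
# report the average of each dimension. deque(maxlen=N) dequeues automatically
# once the queue is full.
from collections import deque

class EmotionSmoother:
    def __init__(self, n=10):                 # N is a design choice
        self.window = deque(maxlen=n)

    def update(self, dims):
        """dims: (intensity, pleasure, control, certainty, tension) for one frame."""
        self.window.append(dims)
        k = len(self.window)
        return tuple(sum(d[i] for d in self.window) / k for i in range(5))

# usage: feed each frame's five-dimensional result; read back the running average
smoother = EmotionSmoother(n=10)
avg = smoother.update((62.0, 15.0, 40.0, 55.0, 70.0))
```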

According to Zhang Congcong's experiment [27] on classical orchestral music, Bach's evaluation was significantly higher than that of other composers; his music in a minor key had P and D values that were significantly higher than those of the music in a major key, and the PAD values of his faster music were significantly higher than those of his slower music. In terms of color, the A value of longer-wavelength colors was higher than that of shorter-wavelength colors; for multicolor, the PAD values of saturated and bright colors were higher than those of soft and dark colors as a whole. There was a nonlinear relationship between the D value and brightness, but in black and white, the D value of black was significantly higher than that of white and gray.

Based on the results of his study, we captured a user's expression by accessing the user's camera and then used the five-dimensional emotion model to detect the user's continuous emotions and calculate the average value of the dimensions. From this, we could determine which dimensions most needed to be strengthened. Finally, through color gradient animation, smooth music switching, and other applications of JavaScript, the model displayed the interface style and background music that could improve the required dimension values and guide the user's emotion in a positive direction.

For example, when the user was in a low-pleasure and low-control mood, we could use high-saturation blue as the main style and Bach's fast, minor-keyed Orchestral Suite No. 2 in B Minor, BWV 1067, VII. Badinerie as the background music to improve the user's pleasure and control.

Table 10: The motivational dimension values (average values) corresponding to seven emotions.

            Valence   Arousal   Motivational
Happiness   2.77      1.21      1.34
Anger       −1.98     1.10      −0.59
Sad         −0.89     0.17      −0.96
Fear        −0.93     1.30      −1.43
Surprise    1.72      1.71      −0.32
Disgust     −1.80     0.40      −1.36
Calm        −0.53     −1.25     −0.03

Table 11: Coefficients of the motivational dimension values corresponding to the sample AUs.

               mouth_open   mouth_rad   eyes_open   eyebrows_height   eyebrows_distance
Valence        0.714        1.000       0.354       0.136             0.000
Arousal        1.000        0.329       0.200       0.432             0.000
Motivational   0.309        1.000       0.183       0.000             0.331

Table 12: Comparison of accuracy between the motivational dimensional model and the five-dimensional emotion model.

                             Motivational dimensional model   Five-dimensional emotion model
Complete data                0.71                             0.78
Training set sampling data   0.68                             0.79
Test data                    0.71                             0.73


When the user was in a high-pleasure mood, we could use high-saturation red as the main style and Bach's fast, major-keyed Brandenburg Concerto No. 2 in F, BWV 1047, I. [Allegro] as background music to improve the user's intensity and keep the user happy for a longer time.
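The selection logic can be sketched as a small rule table. Only the two rules below are taken from the examples in the text; the numeric thresholds, the 0–100 scale assumed for the averaged dimension values, and the fallback branch are placeholders.

```python
# Illustrative theme selection for the mood-adaptive interface prototype.
def pick_theme(avg):
    """avg: dict of averaged dimension values, assumed to lie on a 0-100 scale."""
    low, high = 40, 60                           # assumed cut-offs, not from the paper
    if avg["pleasure"] < low and avg["control"] < low:
        return {"color": "high-saturation blue",
                "music": "Bach, Orchestral Suite No. 2 in B Minor, BWV 1067, VII. Badinerie"}
    if avg["pleasure"] > high:
        return {"color": "high-saturation red",
                "music": "Bach, Brandenburg Concerto No. 2 in F, BWV 1047, I. [Allegro]"}
    return {"color": "neutral", "music": None}   # fallback not specified in the paper

theme = pick_theme({"intensity": 55, "pleasure": 30, "control": 35, "certainty": 50, "tension": 45})
```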

6. Conclusion

Artificial intelligence has played a promising and productive role in many fields, but an AI that cannot distinguish emotion can only exist as a machine and loses much of the human experience. Therefore, we constructed the five-dimensional emotion model in this paper and carried out research and exploration in the field of emotion recognition. The highlights are as follows:

(i) We determined the five dimensions of the proposed emotion model as intensity, pleasure, control, certainty, and tension and simplified the AUs of the FACS. Based on the KDEF database, we determined the feature weights of the five dimensions corresponding to the sample AUs and used a multilayer perceptron to train our model, achieving higher accuracy than recognition by human volunteers.

(ii) We compared the five-dimensional emotion model with the traditional PAD emotional evaluation model under the same conditions and demonstrated that our proposed model was better in terms of accuracy.

(iii) Finally, we discussed the practical application of the five-dimensional emotion model with the illustration of a prototype emotion-detecting interface as an example.

Data Availability

The data used in the study are available in the Karolinska Directed Emotional Faces (KDEF) dataset: https://www.kdef.se/index.html.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

References

[1] W. James, Functional Psychology, Hubei Science & Technology Press, Wuhan, China, 2016.

[2] A. Mehrabian, "Framework for a comprehensive description and measurement of emotional states," Genetic, Social, and General Psychology Monographs, vol. 121, no. 3, pp. 339–361, 1995.

[3] A. Mehrabian, "Pleasure-arousal-dominance: a general framework for describing and measuring individual differences in temperament," Current Psychology, vol. 14, no. 4, pp. 261–292, 1996.

[4] A. Mehrabian, C. Wihardja, and E. Ljunggren, "Emotional correlates of preferences for situation-activity combinations in everyday life," Genetic, Social, and General Psychology Monographs, vol. 123, no. 4, pp. 461–477, 1997.

[5] N. H. Frijda, The Emotions, pp. 62–65, Cambridge University Press, Cambridge, UK, 1986.

[6] C. E. Izard, Human Emotions, Springer, New York, NY, USA, 1977.

[7] L. Wang, "Research on Agent emotion model based on the OCC model," Microcomputer Information, vol. 23, 2007.

[8] J. Han, "Cognitive emotion model for eldercare robot in smart home," China Communications, vol. 12, no. 4, pp. 32–41, 2015.

[9] Z. Mi, "Research on dimension emotion recognition model based on ConvLSTM network," Computer Engineering and Applications, 2020.

[10] S. Wang, "Perceptual image retrieval based on emotion model," Journal of Circuits and Systems, vol. 8, no. 6, 2003.

[11] C. Fu, C. Liu, C. T. Ishi, and H. Ishiguro, "Multi-modality emotion recognition model with GAT-based multi-head inter-modality attention," Sensors, vol. 20, no. 17, p. 4894, 2020.

[12] H. Cai, "Emotion classification model based on word embedding and CNN," Application Research of Computers, vol. 33, no. 10, 2016.

[13] P. A. Gable and E. Harmon-Jones, "Approach-motivated positive affect reduces breadth of attention," Psychological Science, vol. 19, no. 5, pp. 476–482, 2008.

[14] P. Gable and E. Harmon-Jones, "The blues broaden, but the nasty narrows," Psychological Science, vol. 21, no. 2, pp. 211–215, 2010.

[15] P. A. Gable and E. Harmon-Jones, "The effect of low versus high approach-motivated positive affect on memory for peripherally versus centrally presented information," Emotion, vol. 10, no. 4, pp. 599–603, 2010.

[16] P. Gable and E. Harmon-Jones, "The motivational dimensional model of affect: implications for breadth of attention, memory, and cognitive categorisation," Cognition & Emotion, vol. 24, no. 2, pp. 322–337, 2010.

[17] P. Ekman, W. V. Friesen, and S. S. Tomkins, The Facial Affect Scoring Technique (FAST), Consulting Psychologists Press, Palo Alto, CA, USA, 1971.

[18] C. A. Smith and P. C. Ellsworth, "Patterns of cognitive appraisal in emotion," Journal of Personality and Social Psychology, vol. 48, no. 4, pp. 813–838, 1985.

[19] J. W. Kalat and M. N. Shiota, Emotions, China Light Industry Press, Beijing, China, 2009.

[20] Z. Meng, "A preliminary attempt to determine the facial expression patterns of babies," Acta Psychologica Sinica, vol. 1, pp. 55–61, 1985.

[21] C. E. Izard, "Emotions as motivations," in Nebraska Symposium on Motivation, pp. 163–200, University of Nebraska Press, Lincoln, NE, USA, 1978.

[22] Z. Meng, "Why facial expressions can be used as objective indicators of emotion research," Acta Psychologica Sinica, vol. 2, pp. 124–128, 1987.

[23] P. Ekman, W. V. Friesen, and S. S. Tomkins, The Facial Action Coding System (FACS), Consulting Psychologists Press, Palo Alto, CA, USA, 1978.

[24] D. Lundqvist, A. Flykt, and A. Öhman, The Karolinska Directed Emotional Faces - KDEF, Karolinska Institute, Solna, Sweden, 1998, https://www.kdef.se/index.html.


[25] X. Lee, PAD Three-Dimensional Model, China Computer World, Beijing, China, 2007.

[26] M. Xia, Convergence and Avoidance of Efficacy and Motivation Dimensions in Emotion Recognition and Efficacy Preference Effects, Southwest University, Chongqing, China, 2015.

[27] C. Zhang, Emotional Relationship between Music and Color, East China Normal University, Shanghai, China, 2014.

10 Mobile Information Systems

Page 2: TheStudyofaFive-DimensionalEmotionalModelforFacial ... · pictureasagroupandthereforehad980groupsofdata.We dividedthisdatainto880groupsfortraining,andthe remaining100groupswereusedastestdata

implied that an emotion space can be established usingappropriate methods in which each emotion can be seen as avector of that space [10] Fu et al [11] and Cai [12] used aneural network to directly train data features to obtain amodel On the basis of a series of experimental studies[13ndash15] Gable and Harmon-Jones first directly tested thehypothesis that motivation-related factors of emotion mayaffect cognitive processing such as attention and memoryand proposed the motivational dimensional model of affect[16] Many of these studies have not been adapted well to thefield of emotion recognition and this is why we propose thefive-dimensional emotion model

2 Five-Dimensional Emotion Model

In view of the shortcomings of those existing models for ourpurposes we deduced and defined five dimensions related toemotion and then built a five-dimensional emotion modeland evaluated it

21 Determine Dimensions In 1971 Ekman developed theFacial Affect Scoring Technique (FAST) [17] which is anemotional measurement system based on facial musclemovement components According to this system he listedsix basic emotions happiness anger fear sadness surpriseand disgust On the premise that this dimensional emotiontheory provided an objective truth there must be differentemotional dimension values that could be assigned to dif-ferent basic emotions Accordingly we established a methodin reverse which determines basic emotions from inferreddimensions

Because ldquodisgustrdquo can be understood as an evaluation ofexternal objects or stimuli which was not conducive todetermine the dimensions it was temporarily excluded inthe derivation Based on two existing emotion models wefirst assumed two known dimensions intensity and pleasureIntensity is used to describe the degree of individual neuralactivation pleasure is used to distinguish the positive andnegative aspects of emotions -ese dimensions are internalexperiences belonging to an individualrsquos own feelings-osefeelings can be divided into internal dimensions Intensity isnot a determinant of emotion it is only used to describe thestrength of an emotion Pleasure distinguishes happinessfrom the other five basic emotions and makes happinessexist as a positive emotion

In 1985 Smith and Ellsworth determined six dimensionsof emotion through experimentation certainty pleasureattention activity control expected effort and sense of re-sponsibility-ey then proposed a cognitive evaluation theoryof emotion [18] According to this theory the core dimen-sions that distinguish anger from other negative emotions arecertainty control and responsibility Anger is an emotionthat often occurs when something negative happens to theindividual and others were responsible andor have controlAlso the individual has a fair sense of certainty about theevent Fear is an emotion that appears often when the en-vironment controls negative events With it individuals havea high sense of uncertainty for the negative objective events

In Kalat and Shiotarsquos study of emotion they pointed out thatunder the same conditions the difference between anger andsadness is whether people can control the situation [19] Sowe can determine two dimensions to distinguish negativeemotions control and certainty

Surprise and fear are similar in a large number of casesIn a 1985 study Meng [20] determined the facial expressionpatterns of infants -ey asked strangers to hold rats in frontof infants to cause fear and presented rats that ran freely in abox to cause surprise in the infants -ey obtained thepredicted correct results in this experiment It can be seenfrom this comparison that the key variable determining fearor surprise was whether the infants increased their responseintensity to the external stimuli namely tension From thiswe can determine the dimension that distinguishes surprisefrom fear tension

Because the three dimensions described above are re-lated to cognition and they lead to emotional differences wecan categorize them as external dimensions

To reduce the potential error caused by the failure ofindividuals to analyze emotions well we employed 65 peopleto evaluate the relationships between the basic emotions andthe five dimensions at six levels of very high (100) high (75)medium (50) low (25) very low (0) and no correlation(empty) In addition to the given level some people thinkthat the difference between these levels is too large and thusfill in other values such as 30 -is was not expected to havean impact on the experimental results

In order to reduce the error we deleted outliers (definedas data with very large deviations) Based on this we thoughtthat a relationship with too high a standard deviation andtoo large a dispersion was irrelevant for our use becausedifferent individuals were expressing different feelings -estandard deviation between the five basic emotions and thefive dimensions is shown in Table 1

Finally we calculated the average of the remaining dataand determined the corresponding level relationships be-tween the five dimensions and the basic emotions -erelationship is shown in Table 2

22VerifyDimensions Izard points out that emotion playsa key role in adaptation and survival Emotions haveadaptability and mobility -e evidence of this conclusionis that in a study of comparative anatomy and compar-ative psychology it was confirmed that facial activitiesplay an important role in the lives of some species ofvertebrates this emphasizes the role of facial expression inemotion research [21] On this basis Meng Zhaolanproposed that facial expression can be used as basic meansof determining emotions in research [22]

-erefore we can verify the five dimensions of ourmodel through the study of facial motion or expression

Ekman and other researchers built the Facial ActionCoding System (FACS) in the mid-1970s by studying theirown facial expressions [23] -is system is not based onmuscles but on facial activity namely an activity unit (AU)Each AU is associated with one or more muscles -e AUswe used are shown in Table 3

2 Mobile Information Systems

An individual can generally judge anotherrsquos emotions bya quick scan of their expressions In this case the visualimage of the face is very hazy and many AUs cannot berecognized clearly -is would suggest that there is no needfor so many AUs in judging emotions So we could simplifythe table of active units and merged the 24 AUs into fivesimple AUs the movement of the eyebrows up and downthe distance between the eyebrows the closure of the eyesthe curvature of the mouth corners and the closure of themouth

By using the Karolinska Directed Emotional Faces(KDEF) [24] a dataset of facial expressions and corre-sponding emotions provided by the Karolinska Institute to

analyze the simple AUs we could verify the accuracy of ourmodel

Using the Face++ engine we extracted 20 feature pointsfrom the KDEF expressions (see Figure 1)

-en we calculated the coordinates of these featurepoints in order to get the quantized values of each sampleAU

For the closure of the mouth the following equationswere used

mouth reference PrimeDistance from top of lower lip to chinPrime

mouth height PrimeDistance from top of lower lip to top of upper lipPrime

mouth open mouth height

mouth referencelowast 100

(1)

-e reason for introducing the mouth_reference here isthat it can eliminate the influence of the camerarsquos distanceand proximity on the coordinate value by finding a referencesystem in the face-e other similar operations are the same

For the curvature of the mouth corners the followingequations were used

lip up PrimeY minus axis position of upper lipPrime

lip down PrimeY minus axis position of lower lipPrime

lip center PrimeAverage number of y minus axis positions of the corners of themouth on both sidesPrime

lip up0 |lip up minus lip center|

lip down0 |lip down minus lip center|

mouth rad lip down0

lip up0 + lip down0lowast 100

(2)

Table 1 -e standard deviation between the five basic emotionsand the five dimensions

Intensity Pleasure Control Certainty TensionHappiness 6288 8528 6102 5225 3176Anger 7103 2977 6008 5422 5827Sadness 5953 2820 4592 5546 4538Fear 6798 2601 407 4204 7184Surprise 6059 3938 5237 4909 3735

Table 2 -e level relationship between the five basic emotions andthe five dimensions

Intensity Pleasure Control Certainty TensionHappiness -uarr uarr -uarr - -darrAnger uarr darr -uarr - -uarrSadness -uarr darr - - -Fear -uarr darr -darr -darr uarrSurprise -uarr -darr - - -darruarr stands for a very high value darr stands for a very low value -darr stands for alow value -uarr stands for a high value and - stands for a medium value emptyfor irrelevant

Mobile Information Systems 3

For the closure of the eyes the following equations wereused

left eye width PrimeLeft eye width

Prime

left eye height PrimeDistance from the left eyelid buttom to the top

left eye open left eye height

left eye width

right eye width PrimeRight eye widthPrime

right eye height PrimeDistance from the right eyelid buttom to the topPrime

right eye open right eye height

right eye width

eyes open (left eye open + right eye open) lowast 100

(3)

For the movement of the eyebrows up and down and thedistance between the eyebrows the following equations wereused

left eyebrow width PrimeLeft eyebrow widthPrime

right eyebrow width PrimeRight eyebrow widthPrime

eyebrows center PrimeY minus axis position of between the eyebrowsPrime

eyebrows distance eyebrows centerlowast 2

left eyebrow width + right eyebrow width1113888 1113889 minus 071113888 1113889lowast 500

(4)

Table 3 Single activity unit (AU) of FACS (except from P Ekman and W V Friesen ldquoMeasuring Facial Movementrdquo EnvironmentalPsychology and Nonverbal Behavior vol 1 pp 56ndash75 1976)

AU number Description Facial muscle1 Inner brow raiser Frontalis (pars medialis)2 Outer brow raiser (unilateral right side) Frontalis (pars lateralis)4 Brow lowered Depressor glabellae depressor supercilii and corrugator supercilii5 Upper lid raiser Levator palpebrae superioris6 Cheek raiser Orbicularis oculi (pars orbitalis)7 Lid tightener Orbicularis oculi (pars palpebralis)10 Upper lip raiser Levator labii superioris and caput infraorbitalis12 Lip corner puller Zygomatic major14 Dimpler Buccinator15 Lip corner depressor Depressor anguli oris (triangularis)16 Lower lip depressor Depressor labii inferioris25 Lips part Depressor labii or relaxation of mentalis (AU17) or orbicularis oris26 Jaw drop Masseter temporalis and internal pterygoid relaxed27 Mouth stretch Pterygoids and digastric28 Lip suck Orbicularis oris

4 Mobile Information Systems

-e operation here is that we used the data of individualswith extreme results to calculate the maximum and mini-mum values of the distance between the eyebrows and then

mapped this value to the range of 0ndash100 -e followingsimilar operations are the same as this one

left eyebrow height PrimeDistance from the right contour of the left eyebrow to the left eyersquosPrime

right eyebrow height PrimeDistance from the left contour of the right eyebrow to the right eyersquosPrime

eyebrows height left eyebrow height + right eyebrow height

left eyebrow width + right eyebrow width1113888 1113889 minus 041113888 1113889lowast 250

(5)

In this way we determined the values for the sample AUsfrom the expression pictures needed to calculate the five-dimensional values the closure of the mouth corresponds tomouth_open the curvature of the mouth corners corre-sponds tomouth_rad the closure of the eyes corresponds toeyes_open the distance between the eyebrows correspondsto eyebrows_distance and the movement of eyebrows up anddown corresponds to eyebrows_height

221 Correlation Analysis We conducted a Pearson corre-lation analysis on the five variables obtained -e overall cor-relation coefficient is represented by ρ in a populationGenerally ρ is unknown so we needed to estimate ρ ap-proximately through a sample correlation coefficient r Becauseof the different values within the individual samples the valuesof r varied -erefore we needed to examine the suitability ofthe sample correlation coefficient and carry out a significancetest on it By using SPSS we calculated the sample AUs from theKDEF database with the formulas that were described-en weconducted the Pearson correlation analysis with the five

variables (ie the dimensions of the corresponding emotionsobtained) After that we usedmin-max normalization which isa commonly used normalization method and calculated thecorrelation coefficients of the five simple AUs -ey could thenbe used as feature weights when we calculated the five-di-mensional values -e comparison results of the correlationcoefficients of the five sample AUs are shown in Table 4

We calculated the corresponding five-dimensionalvalues of the images from the KDEF database by applying theobtained coefficients

222 Multilayer Perceptron A multilayer perceptron cansolve most linear nonseparable problems In this part of thestudy we used a multilayer perceptron which was com-posed of five layers of neurons -is network included aninput layer hidden layers and an output layer and it usedgradient clipping and dropout methods to regularize

In the multilayer perceptron we used a softmax acti-vation function and a cross-entropy loss function Weconsidered the five-dimensional data of an expression

Figure 1 Facial feature points of the KDEF expressions

Mobile Information Systems 5

picture as a group and therefore had 980 groups of data Wedivided this data into 880 groups for training and theremaining 100 groups were used as test data

After 5180104 iterations of training the accuracy of themodel achieved is shown in Table 5

In order to verify the accuracy of the model we alsoinvited 12 volunteers of different ages to identify the same100 randomly selected facial images and calculated theiraccuracy At the same time we let the model identify the 100facial images -e final results are shown in Table 6

-e results show that the accuracy of the model rec-ognition after training was higher than that of humanvolunteers

3 Comparison between the Five-DimensionalEmotional Model and the PAD EmotionalEvaluation Model

31 PAD Emotional Evaluation Model -e PAD emotionalevaluation model is a model based on emotional dimensiontheory According to the evaluation method of semanticdifferences Mehrabian and Russell defined the three di-mensions of emotion as ldquoPrdquo for pleasurendashdispleasure ldquoArdquo forarousalndashnonarousal and ldquoDrdquo for dominancendashsubmissive-ness [2ndash4] Among them P represents the positive ornegative degree of an individualrsquos emotional state A rep-resents the physiological activation level of an individualrsquosnerves indicating that the individual is excited or depressedand D represents the subjective control state an active orpassive state of the individual to the situation and others

-e PAD emotional evaluation model as the most fa-mousmodel in emotional dimension theory has been widelyused in emotional research In this part of the study we

tested the applicability of the five-dimensional emotionmodel by comparing it with the PAD emotional model

32 Contrast Experiment of Accuracy under the SameConditions For the PAD emotional model we used thesame sample and multilayer perceptron with the same inputlayer and hidden layers as the five-dimensional emotionalmodel for training

Lee et al at the Institute of Psychology ChineseAcademy of Sciences compiled the Chinese version of thePAD emotion scales and evaluated the PAD values of the 14basic emotions -is included basic emotions proposed bydifferent researchers such as happiness anger relaxationand fear [25] In the current research we used PAD valuescorresponding to seven basic emotions from the KDEFdatabase to analyze the correlation of the results -e PADvalues of the seven emotions are shown in Table 7

We analyzed the Pearson correlations between the PADvalues of emotions and the five sample AUs and took themin-max normalized coefficients as the feature weights ofthe PAD emotional evaluation model -e coefficients ob-tained are shown in Table 8

Based on these parameters we transformed the sampleAUs of the expression images of the KDEF database into thecorresponding PAD values and then trained the multilayerperceptron with the data After 6000000 iterations oftraining we were able to compare the accuracies of the PADemotional model and the five-dimensional emotion model-e final results are shown in Table 9

Based on this result we felt that the five-dimensionalemotion model was better than the PAD emotional evalu-ation model when applied to emotion recognition -ecomparison results are graphically shown in Figure 2

Table 4 Coefficients of the sample AUs corresponding to the five dimensions

mouth_open mouth_rad eyes_open eyebrows_height eyebrows_distanceIntensity 1000 0657 0000 0569 0341Pleasure 0325 1000 0156 0000 0346Control 0195 1000 0169 0000 0618Certainty 0000 0733 0609 0224 1000Tension 0648 0000 0755 1000 0163

Table 5 Model accuracy

Whole data Training set sampling data Test data078 079 073

Table 6 Model and volunteersrsquo recognition accuracy

ModelVolunteers

X1 X2 X3 X4 X5 X6 X7 X8 X9 X10 X11 X12 X074 054 064 066 061 058 068 067 069 067 066 061 068 064

6 Mobile Information Systems

4 Comparison between the Five-DimensionalEmotional Model and the MotivationalDimensional Model

-e PAD model proposes the existence of a dominancedimension in addition to the valence and arousal dimen-sions Lee [25] evaluated the values of the basic emotions onthe PAD dimension Since the three dimensions of the PADmodel are independent of each other we can eliminate theindividual dominance dimension and keep only the values ofpleasure and arousal for the valence and arousal dimensions

-e motivational dimensional model of affect points tothe existence of a motivational dimension in addition to thevalence and arousal dimensions Xia [26] designed experi-ments to assess the motivational dimension of basic emo-tions on a scale of 1ndash9 By normalizing his experimentalresults with Leersquos we can map them to a range of minus4 to 4 Wecan obtain a numerical representation of basic emotions onthe motivational dimension model which is shown inTable 10

We analyzed the Pearson correlations between the motivational dimension values and the five sample AUs and took the min-max normalized coefficients as the feature weights of the motivational dimensional model. The coefficients obtained are shown in Table 11.

Based on these parameters, we transformed the sample AUs of the expression images in the KDEF database into the corresponding motivational dimension values and then trained the multilayer perceptron with these data. After 2,742,366 iterations of training, we compared the accuracy of the motivational dimensional model with that of the five-dimensional emotion model. The final results are shown in Table 12.

5. Application of the Five-Dimensional Emotion Model

Human emotion recognition has a wide range of application prospects and potential economic value in man-machine interaction, electronic education, robotics, entertainment, the medical field, and other areas, so it has attracted a great deal of attention in academia and industry, and a large number of related projects on human emotion have been carried out in both arenas. Companies such as IBM in the United States, Sony in Japan, and Philips in Europe have been interested and involved in this research. China has also gradually attached importance to this kind of research, which has been explored at the Beijing University of Science and Technology, the Institute of Automation of the Chinese Academy of Sciences, and elsewhere.

Table 7: PAD values corresponding to seven emotions (average values).

Emotion     P       A       D
Happiness   2.77    1.21    1.42
Anger       −1.98   1.10    0.60
Sad         −0.89   0.17    −0.70
Fear        −0.93   1.30    −0.64
Surprise    1.72    1.71    0.22
Disgust     −1.80   0.40    0.67
Calm        −0.53   −1.25   −0.84

Table 8: Coefficients of PAD values corresponding to sample AUs.

    mouth_open   mouth_rad   eyes_open   eyebrows_height   eyebrows_distance
P   1.000        0.968       0.488       0.190             0.000
A   1.000        0.000       0.293       0.475             0.150
D   0.930        1.000       0.000       0.254             0.635

Table 9: Comparison of accuracy between PAD and the five-dimensional emotion model.

                             PAD    Five-dimensional emotion model
Whole data                   0.69   0.78
Training set sampling data   0.66   0.79
Test data                    0.70   0.73

Figure 2: Comparison of accuracy between PAD and the five-dimensional emotion model (bars show the PAD model, the five-dimensional model, and their difference for the whole data, training set sampling data, and test data).


In short, research on the five-dimensional emotion model as applied in emotion recognition has far-reaching scientific significance and extensive application value. To this end, we took an adaptive man-machine interactive system based on the five-dimensional emotion model as an example to illustrate its practical application.

Man-machine interaction is a subject that studies the design, evaluation, and implementation of interactive computing systems for the convenience of human users. The adaptive interface is a mainstream topic in man-machine interaction research and application, and with the development of artificial intelligence, emotion is becoming more and more important in adaptive interface research. Through the introduction of the five-dimensional emotion model, a man-machine interactive interface can observe a user's emotions and thereby provide a more user-friendly experience.

For a general user interface, color and background music are part of the core content. Based on this, we designed the style of our user interface to change when users were experiencing different emotions.

Before designing it, we needed to make some adaptive improvements to the five-dimensional emotion model. Because different people have different facial features, behavioral habits, and uncontrollable expression changes, recognition errors would arise if the model judged a continuous emotion from only a single frame of expression. We therefore improved the model by preparing a queue of length N, each element of which holds the final result of one emotion detection; when the queue is full, the oldest result is dequeued before the newest is enqueued, so the model always holds the user's N most recent detection results. Because emotion is a continuously changing process, analyzing these results together yields more accurate judgments than any single frame.
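A minimal sketch of this length-N queue follows, assuming each detection yields a five-dimensional vector; the detector itself and the choice of N are not specified in the paper and are placeholders here.

```python
# Sliding window over per-frame detections, as described above. The five
# dimension values of each detection are averaged over the window.
from collections import deque
import numpy as np

class EmotionSmoother:
    def __init__(self, n: int = 10):
        # deque with maxlen implements the dequeue-then-enqueue behaviour automatically
        self.window = deque(maxlen=n)

    def add(self, detection):
        """detection: length-5 vector (intensity, pleasure, control, certainty, tension)."""
        self.window.append(np.asarray(detection, dtype=float))

    def average(self):
        """Mean of the dimensions over the last N detections (None until the queue fills)."""
        if len(self.window) < self.window.maxlen:
            return None
        return np.mean(self.window, axis=0)
```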

According to Zhang Congcong's experiment [27] on classical orchestral music, Bach's music was evaluated significantly more highly than that of other composers; his music in a minor key had P and D values significantly higher than his music in a major key, and the PAD values of his faster pieces were significantly higher than those of his slower pieces. In terms of color, the A value of longer-wavelength colors was higher than that of shorter-wavelength colors; for multicolor schemes, the PAD values of saturated and bright colors were, as a whole, higher than those of soft and dark colors. The relationship between the D value and brightness was nonlinear, but in black and white, the D value of black was significantly higher than that of white and gray.

Based on the results of his study, we captured a user's expression through the user's camera, used the five-dimensional emotion model to detect the user's emotions continuously, and calculated the average value of each dimension. From this, we could determine which dimensions most needed to be strengthened. Finally, through color gradient animation, smooth music switching, and other JavaScript techniques, the system displayed the interface style and background music that could raise the required dimension values and guide the user's emotion in a positive direction.
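As an illustration of this selection step, the following hedged sketch maps the averaged dimension values onto the two interface themes described in the example below; the zero thresholds and the fallback branch are hypothetical, and Python is used here only for consistency with the other sketches (the prototype interface itself was driven by JavaScript).

```python
# Hedged sketch of the style-selection step: averaged five-dimensional values
# from the smoother above are mapped to an interface theme and a background
# track. The themes mirror the example in the next paragraph; the cut-off
# values are assumptions, not taken from the paper.
def choose_interface_style(avg):
    """avg: (intensity, pleasure, control, certainty, tension) averaged over the queue."""
    intensity, pleasure, control, certainty, tension = avg
    if pleasure < 0 and control < 0:
        # low pleasure and low control: calming, control-raising theme
        return {"color": "high-saturation blue",
                "music": "Bach, Orchestral Suite No. 2 in B Minor, BWV 1067: VII. Badinerie"}
    if pleasure > 0:
        # already pleasant: energetic theme to raise intensity and sustain the mood
        return {"color": "high-saturation red",
                "music": "Bach, Brandenburg Concerto No. 2 in F, BWV 1047: I. [Allegro]"}
    return {"color": "neutral", "music": None}  # fallback; behaviour not specified in the paper
```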

For example, when the user was in a low-pleasure, low-control mood, we could use high-saturation blue as the main style and Bach's fast, minor-keyed Orchestral Suite No. 2 in B Minor, BWV 1067: VII. Badinerie as the background music to improve the user's pleasure and control. When the user was in a high-pleasure mood, we could use high-saturation red as the main style and Bach's fast, major-keyed Brandenburg Concerto No. 2 in F, BWV 1047: I. [Allegro] as the background music to improve the user's intensity and keep the user happy for longer.

Table 10: The motivational dimension values corresponding to seven emotions (average values).

Emotion     Valence   Arousal   Motivational
Happiness   2.77      1.21      1.34
Anger       −1.98     1.10      −0.59
Sad         −0.89     0.17      −0.96
Fear        −0.93     1.30      −1.43
Surprise    1.72      1.71      −0.32
Disgust     −1.80     0.40      −1.36
Calm        −0.53     −1.25     −0.03

Table 11: Coefficients of the motivational dimension values corresponding to sample AUs.

               mouth_open   mouth_rad   eyes_open   eyebrows_height   eyebrows_distance
Valence        0.714        1.000       0.354       0.136             0.000
Arousal        1.000        0.329       0.200       0.432             0.000
Motivational   0.309        1.000       0.183       0.000             0.331

Table 12: Comparison of accuracy between the motivational dimensional model and the five-dimensional emotion model.

                             Motivational dimensional model   Five-dimensional emotion model
Whole data                   0.71                             0.78
Training set sampling data   0.68                             0.79
Test data                    0.71                             0.73

6. Conclusion

Artificial intelligence has played a promising and productive role in many fields, but an AI that cannot distinguish emotion can only exist as a machine and misses much of the human experience. Therefore, we constructed the five-dimensional emotion model in this paper and carried out research and exploration in the field of emotion recognition. The highlights are as follows:

(i) We determined the five dimensions of the proposed emotion model as intensity, pleasure, control, certainty, and tension and simplified the AUs of the FACS. Based on the KDEF database, we determined the feature weights of the five dimensions corresponding to the sample AUs and used a multilayer perceptron to train a model whose accuracy was higher than that of recognition by human volunteers.

(ii) We compared the five-dimensional emotion model with the traditional PAD emotional evaluation model under the same conditions and demonstrated that our proposed model was better in terms of accuracy.

(iii) Finally, we discussed the practical application of the five-dimensional emotion model, using a prototype emotion-detecting adaptive interface as an example.

Data Availability

The data used in this study are available from the Karolinska Directed Emotional Faces (KDEF) database: https://www.kdef.se/index.html.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

References

[1] W. James, Functional Psychology, Hubei Science & Technology Press, Wuhan, China, 2016.

[2] A. Mehrabian, "Framework for a comprehensive description and measurement of emotional states," Genetic, Social, and General Psychology Monographs, vol. 121, no. 3, pp. 339–361, 1995.

[3] A. Mehrabian, "Pleasure-arousal-dominance: a general framework for describing and measuring individual differences in temperament," Current Psychology, vol. 14, no. 4, pp. 261–292, 1996.

[4] A. Mehrabian, C. Wihardja, and E. Ljunggren, "Emotional correlates of preferences for situation-activity combinations in everyday life," Genetic, Social, and General Psychology Monographs, vol. 123, no. 4, pp. 461–477, 1997.

[5] N. H. Frijda, The Emotions, pp. 62–65, Cambridge University Press, Cambridge, UK, 1986.

[6] C. E. Izard, Human Emotions, Springer, New York, NY, USA, 1977.

[7] L. Wang, "Research on agent emotion model based on the OCC model," Microcomputer Information, vol. 23, 2007.

[8] J. Han, "Cognitive emotion model for eldercare robot in smart home," China Communications, vol. 12, no. 4, pp. 32–41, 2015.

[9] Z. Mi, "Research on dimension emotion recognition model based on ConvLSTM network," Computer Engineering and Applications, 2020.

[10] S. Wang, "Perceptual image retrieval based on emotion model," Journal of Circuits and Systems, vol. 8, no. 6, 2003.

[11] C. Fu, C. Liu, C. T. Ishi, and H. Ishiguro, "Multi-modality emotion recognition model with GAT-based multi-head inter-modality attention," Sensors, vol. 20, no. 17, p. 4894, 2020.

[12] H. Cai, "Emotion classification model based on word embedding and CNN," Application Research of Computers, vol. 33, no. 10, 2016.

[13] P. A. Gable and E. Harmon-Jones, "Approach-motivated positive affect reduces breadth of attention," Psychological Science, vol. 19, no. 5, pp. 476–482, 2008.

[14] P. Gable and E. Harmon-Jones, "The blues broaden, but the nasty narrows," Psychological Science, vol. 21, no. 2, pp. 211–215, 2010.

[15] P. A. Gable and E. Harmon-Jones, "The effect of low versus high approach-motivated positive affect on memory for peripherally versus centrally presented information," Emotion, vol. 10, no. 4, pp. 599–603, 2010.

[16] P. Gable and E. Harmon-Jones, "The motivational dimensional model of affect: implications for breadth of attention, memory, and cognitive categorisation," Cognition & Emotion, vol. 24, no. 2, pp. 322–337, 2010.

[17] P. Ekman, W. V. Friesen, and S. S. Tomkins, The Facial Affect Scoring Technique (FAST), Consulting Psychologists Press, Palo Alto, CA, USA, 1971.

[18] C. A. Smith and P. C. Ellsworth, "Patterns of cognitive appraisal in emotion," Journal of Personality and Social Psychology, vol. 48, no. 4, pp. 813–838, 1985.

[19] J. W. Kalat and M. N. Shiota, Emotions, China Light Industry Press, Beijing, China, 2009.

[20] Z. Meng, "A preliminary attempt to determine the facial expression patterns of babies," Acta Psychologica Sinica, vol. 1, pp. 55–61, 1985.

[21] C. E. Izard, "Emotions as motivations," in Nebraska Symposium on Motivation, pp. 163–200, University of Nebraska Press, Lincoln, NE, USA, 1978.

[22] Z. Meng, "Why facial expressions can be used as objective indicators of emotion research," Acta Psychologica Sinica, vol. 2, pp. 124–128, 1987.

[23] P. Ekman, W. V. Friesen, and S. S. Tomkins, The Facial Action Coding System (FACS), Consulting Psychologists Press, Palo Alto, CA, USA, 1978.

[24] D. Lundqvist, A. Flykt, and A. Öhman, The Karolinska Directed Emotional Faces—KDEF, Karolinska Institute, Solna, Sweden, 1998, https://www.kdef.se/index.html.

[25] X. Lee, PAD Three-Dimensional Model, China Computer World, Beijing, China, 2007.

[26] M. Xia, Convergence and Avoidance of Efficacy and Motivation Dimensions in Emotion Recognition and Efficacy Preference Effects, Southwest University, Chongqing, China, 2015.

[27] C. Zhang, Emotional Relationship between Music and Color, East China Normal University, Shanghai, China, 2014.



3 Comparison between the Five-DimensionalEmotional Model and the PAD EmotionalEvaluation Model

31 PAD Emotional Evaluation Model -e PAD emotionalevaluation model is a model based on emotional dimensiontheory According to the evaluation method of semanticdifferences Mehrabian and Russell defined the three di-mensions of emotion as ldquoPrdquo for pleasurendashdispleasure ldquoArdquo forarousalndashnonarousal and ldquoDrdquo for dominancendashsubmissive-ness [2ndash4] Among them P represents the positive ornegative degree of an individualrsquos emotional state A rep-resents the physiological activation level of an individualrsquosnerves indicating that the individual is excited or depressedand D represents the subjective control state an active orpassive state of the individual to the situation and others

-e PAD emotional evaluation model as the most fa-mousmodel in emotional dimension theory has been widelyused in emotional research In this part of the study we

tested the applicability of the five-dimensional emotionmodel by comparing it with the PAD emotional model

32 Contrast Experiment of Accuracy under the SameConditions For the PAD emotional model we used thesame sample and multilayer perceptron with the same inputlayer and hidden layers as the five-dimensional emotionalmodel for training

Lee et al at the Institute of Psychology ChineseAcademy of Sciences compiled the Chinese version of thePAD emotion scales and evaluated the PAD values of the 14basic emotions -is included basic emotions proposed bydifferent researchers such as happiness anger relaxationand fear [25] In the current research we used PAD valuescorresponding to seven basic emotions from the KDEFdatabase to analyze the correlation of the results -e PADvalues of the seven emotions are shown in Table 7

We analyzed the Pearson correlations between the PADvalues of emotions and the five sample AUs and took themin-max normalized coefficients as the feature weights ofthe PAD emotional evaluation model -e coefficients ob-tained are shown in Table 8

Based on these parameters we transformed the sampleAUs of the expression images of the KDEF database into thecorresponding PAD values and then trained the multilayerperceptron with the data After 6000000 iterations oftraining we were able to compare the accuracies of the PADemotional model and the five-dimensional emotion model-e final results are shown in Table 9

Based on this result we felt that the five-dimensionalemotion model was better than the PAD emotional evalu-ation model when applied to emotion recognition -ecomparison results are graphically shown in Figure 2

Table 4 Coefficients of the sample AUs corresponding to the five dimensions

mouth_open mouth_rad eyes_open eyebrows_height eyebrows_distanceIntensity 1000 0657 0000 0569 0341Pleasure 0325 1000 0156 0000 0346Control 0195 1000 0169 0000 0618Certainty 0000 0733 0609 0224 1000Tension 0648 0000 0755 1000 0163

Table 5 Model accuracy

Whole data Training set sampling data Test data078 079 073

Table 6 Model and volunteersrsquo recognition accuracy

ModelVolunteers

X1 X2 X3 X4 X5 X6 X7 X8 X9 X10 X11 X12 X074 054 064 066 061 058 068 067 069 067 066 061 068 064

6 Mobile Information Systems

4 Comparison between the Five-DimensionalEmotional Model and the MotivationalDimensional Model

-e PAD model proposes the existence of a dominancedimension in addition to the valence and arousal dimen-sions Lee [25] evaluated the values of the basic emotions onthe PAD dimension Since the three dimensions of the PADmodel are independent of each other we can eliminate theindividual dominance dimension and keep only the values ofpleasure and arousal for the valence and arousal dimensions

-e motivational dimensional model of affect points tothe existence of a motivational dimension in addition to thevalence and arousal dimensions Xia [26] designed experi-ments to assess the motivational dimension of basic emo-tions on a scale of 1ndash9 By normalizing his experimentalresults with Leersquos we can map them to a range of minus4 to 4 Wecan obtain a numerical representation of basic emotions onthe motivational dimension model which is shown inTable 10

We analyzed the Pearson correlations between themotivational dimension value and the five sample AUs andtook the min-max normalized coefficients as the featureweights of the motivational dimensional model -e coef-ficients obtained are shown in Table 11

Based on these parameters we transformed the sampleAUs of the expression images of the KDEF database into thecorresponding motivational dimension value and thentrained the multilayer perceptron with the data After2742366 iterations of training we were able to compare theaccuracy of the motivational dimensional model and thefive-dimensional emotion model -e final results are shownin Table 12

5 Application of the Five-DimensionalEmotion Model

Human emotion has a wide range of application prospectsand potential economic value in man-machine interactionelectronic education robotics entertainment the medicalfield and other areas So it has attracted a great deal ofattention in academia and industry In both arenas a largenumber of related projects about human emotion have beencarried out Some of the companies that have been interestedand involved in this research have been IBM in the UnitedStates Sony in Japan Philips in Europe and many othersChina has gradually attached importance to this kind ofresearch where it has been explored at the Beijing Universityof Science and Technology the Institute of Automation ofthe Chinese Academy of Sciences and others In a fewwords research on the five-dimensional emotion model as

Table 7 PAD values corresponding to seven emotions

EmotionAverage value

P A DHappiness 277 121 142Anger minus198 110 060Sad minus089 017 minus070Fear minus093 130 minus064Surprise 172 171 022Disgust minus180 040 067Calm minus053 minus125 minus084

Table 8 Coefficient of PAD values corresponding to sample AUs

mouth_open mouth_rad eyes_open eyebrows_height eyebrows_distanceP 1000 0968 0488 0190 0000A 1000 0000 0293 0475 0150D 0930 1000 0000 0254 0635

Table 9 Comparison of accuracy between PAD and the five-dimensional emotion model

PAD Five-dimensional emotion modelWhole data 069 078Training set samplingdata 066 079

Test data 070 073

Whole data Training set sampling data Test data

090807060504030201

0

DifferenceFive-dimensional modelPAD model

Figure 2 Comparison of accuracy between PAD and the five-dimensional emotion model expressions

Mobile Information Systems 7

applied in emotion recognition has far-reaching scientificsignificance and extensive application value To this end wetook the adaptive man-machine interactive system based onthe five-dimensional emotion model as an example to il-lustrate its practical application

Man-machine interaction is a subject that studies thedesign evaluation and implementation of interactive com-puting systems for the convenience of human users Anadaptive interface is the mainstream of man-machine inter-action research and application Now with the development ofartificial intelligence emotion is more and more important inthe research of an adaptive interface-rough the introductionof the five-dimensional emotional model the man-machineinteractive interface can observe a userrsquos emotions so as toprovide a more user-friendly user experience

For a general user interface color and backgroundmusicare part of the core content Based on this we designed thestyle of our user interface to change when users were ex-periencing different emotions

Before we designed it we needed to make some adaptiveimprovements to the five-dimensional emotion modelBecause different people have different facial features be-havioral habits and uncontrollable expression changes thiswill lead to recognition errors if the model worked accordingto only one frame of expression while performing a con-tinuous emotional judgment We needed to make someimprovements to the model that is to prepare a queue oflength N each element of which was a final result of eachemotion detection When the queue was full the modelwould perform an enqueue or dequeue operation From thiswe would receive a userrsquos continuous emotion detectionresults N times Because emotion is a continuously changing

process this method would receive more accurate results byanalyzing the emotions continuously

According to Zhang Congcongrsquos experiment [27] inclassic orchestra Bachrsquos evaluation was significantly higherthan others his music in aminor key had P andD values thatwere significantly higher than the music in a major key andthe PAD values of his faster music were significantly higherthan those of his slower music In terms of color the A valueof longer wave color was higher than that of shorter wavesfor multicolor the PAD values of saturated and bright colorswere higher than those of soft and dark colors as a whole-ere was a nonlinear relationship between the D value andbrightness but in black and white the D value of black wassignificantly higher than that of white and gray

Based on the results of his study we captured a userrsquosexpression by accessing the userrsquos camera and then used thefive-dimensional emotion model to detect the userrsquos con-tinuous emotions and calculate the average value of thedimensions For this we could determine which dimensionsneeded to be strengthened most Finally through colorgradient animation music smooth switching and otherapplications of JavaScript the model displayed the interfacestyle and background music that could improve the requireddimension value and guide the userrsquos emotion in a positivedirection

For example when the user was in a low-pleasure andlow-control mood we could use high-saturation blue as themain style and Bachrsquos fast minor-keyed Orchestral Suite No 2in B Minor BWV 1067 VII Badinerie as the backgroundmusic to improve the userrsquos pleasure and control When theuser was in a high pleasure mood we could use high-satu-ration red as the main style and Bachrsquos fast major-keyed

Table 11 Coefficient of the motivational dimension values corresponding to sample AUs

mouth_open mouth_rad eyes_open eyebrows_height eyebrows_distanceValence 0714 1000 0354 0136 0000Arousal 1000 0329 0200 0432 0000Motivational 0309 1000 0183 0000 0331

Table 12 Comparison of accuracy between the motivational dimensional model and the five-dimensional emotion model

Motivational dimensional model Five-dimensional emotion modelComplete data 071 078Training set sampling data 068 079Test data 071 073

Table 10 -e motivational dimension values corresponding to seven emotions

EmotionAverage value

Valence Arousal MotivationalHappiness 277 121 134Anger minus198 110 minus059Sad minus089 017 minus096Fear minus093 130 minus143Surprise 172 171 minus032Disgust minus180 040 minus136Calm minus053 minus125 minus003

8 Mobile Information Systems

Brandenburg Concerto No 2 in F BWV1047I [Allegro] asbackground music to improve the userrsquos intensity and keepthe user happy for a longer time

6 Conclusion

Artificial intelligence has played a promising and productiverole in many fields but an AI that cannot distinguish emotioncan only exist as a machine and loses much of the humanexperience -erefore we constructed the five-dimensionalemotion model in this paper and made research and ex-ploration in the field of emotion recognition -e highlightsare as follows

(i) We determined five dimensions of the proposedemotion model as intensity pleasure control cer-tainty and tension and simplified the AUs of theFACS Based on the KDEF database we determinedthe feature weights of the five dimensions corre-sponding to the sample AUs and used a multilayerperceptron to train our model with higher accuracythan recognition by human volunteers

(ii) We compared the five-dimensional emotion modelwith the traditional PAD emotional evaluationmodel under the same conditions and demonstratedthat our proposed model was better in terms ofaccuracy

(iii) Finally we discussed the practical application of thefive-dimensional emotion model with the illustra-tion of a prototype emotion-detecting interface asan example

Data Availability

-e data used in the study are available in Karolinska Di-rected Emotional Faces (KDEF) httpswwwkdefseindexhtml

Conflicts of Interest

-e authors declare that there are no conflicts of interestregarding the publication of this paper

References

[1] W James Functional Psychology Hubei Science amp Tech-nology Press Wuhan China 2016

[2] A Mehrabian ldquoFramework for a comprehensive descriptionand measurement of emotional statesrdquo Genetic Social andGeneral Psychology Monographs vol 121 no 3 pp 339ndash3611995

[3] A Mehrabian ldquoPleasure-arousal-dominance a generalframework for describing and measuring individual differ-ences in temperamentrdquo Current Psychology vol 14 no 4pp 261ndash292 1996

[4] A Mehrabian C Wihardja and E Ljunggren ldquoEmotionalcorrelates of preferences for situation-activity combinationsin everyday liferdquo Genetic Social and General PsychologyMonographs vol 123 no 4 pp 461ndash477 1997

[5] N H Frijda 9e Emotions pp 62ndash65 Cambridge UniversityPress Cambridge UK 1986

[6] C E Izard Human Emotions Springer New York NY USA1977

[7] L Wang ldquoResearch on Agent emotion model based on theOCC modelrdquo Microcomputer Information vol 23 2007

[8] J Han ldquoCognitive emotionmodel for eldercare robot in smarthomerdquo China Communications vol 12 no 4 pp 32ndash41 2015

[9] Z Mi ldquoResearch on dimension emotion recognition modelbased on ConvLSTM networkrdquo Computer Engineering andApplications 2020

[10] S Wang ldquoPerceptual image retrieval based on emotionmodelrdquo Journal of Circuits and Systems vol 8 no 6 2003

[11] C Fu C Liu C T Ishi and H Ishiguro ldquoMulti-modalityemotion recognition model with GAT-based multi-head in-ter-modality attentionrdquo Sensors vol 20 no 17 p 4894 2020

[12] H Cai ldquoEmotion classification model based on word em-bedding and CNNrdquo Application Research of Computersvol 33 no 10 2016

[13] P A Gable and E Harmon-Jones ldquoApproach-motivatedpositive affect reduces breadth of attentionrdquo PsychologicalScience vol 19 no 5 pp 476ndash482 2008

[14] P Gable and E Harmon-Jones ldquo-e blues broaden but thenasty narrowsrdquo Psychological Science vol 21 no 2pp 211ndash215 2010

[15] P A Gable and E Harmon-Jones ldquo-e effect of low versushigh approach-motivated positive affect on memory for pe-ripherally versus centrally presented informationrdquo Emotionvol 10 no 4 pp 599ndash603 2010

[16] P Gable and E Harmon-Jones ldquo-e motivational dimen-sional model of affect implications for breadth of attentionmemory and cognitive categorisationrdquo Cognition amp Emotionvol 24 no 2 pp 322ndash337 2010

[17] P Ekman W V Friesen and S S Tomkins9e Facial AffectScoring Technique (FAST) Consulting Psychologists PressPalo Alto CA USA 1971

[18] C A Smith and P C Ellsworth ldquoPatterns of cognitive ap-praisal in emotionrdquo Journal of Personality and Social Psy-chology vol 48 no 4 pp 813ndash838 1985

[19] J W Kalat and M N Shiota Emotions China Light IndustryPress Beijing China 2009

[20] Z Meng ldquoA preliminary attempt to determine the facialexpression patterns of babiesrdquo Acta Psychologica Sinica vol 1pp 55ndash61 1985

[21] C E Izard ldquoEmotions as motivationsrdquo in Nebraska Sym-posium on Motivation pp 163ndash200 University of NebraskaPress Lincoln NE USA 1978

[22] Z Meng ldquoWhy facial expressions can be used as objectiveindicators of emotion researchrdquo Acta Psychologica Sinicavol 2 pp 124ndash128 1987

[23] P Ekman W V Friesen and S S Tomkins9e Facial ActionCoding System (FACS) Consulting Psychologists Press PaloAlto CA USA 1978

[24] D Lundqvist A Flykt and A Ohman 9e Karolinska Di-rected Emotional Faces-KDEF Karolinska Institute SolnaSweden 1998 httpswwwkdefseindexhtml

Mobile Information Systems 9

[25] X Lee PAD 9ree-Dimensional Model China ComputerWorld Beijing China 2007

[26] M XiaConvergence and Avoidance of Efficacy andMotivationDimensions in Emotion Recognition and Efficacy PreferenceEffects Southwest University Chongqing China 2015

[27] C Zhang Emotional Relationship between Music and ColorEast China Normal University Shanghai China 2014

10 Mobile Information Systems

Page 4: TheStudyofaFive-DimensionalEmotionalModelforFacial ... · pictureasagroupandthereforehad980groupsofdata.We dividedthisdatainto880groupsfortraining,andthe remaining100groupswereusedastestdata

For the closure of the eyes the following equations wereused

left eye width PrimeLeft eye width

Prime

left eye height PrimeDistance from the left eyelid buttom to the top

left eye open left eye height

left eye width

right eye width PrimeRight eye widthPrime

right eye height PrimeDistance from the right eyelid buttom to the topPrime

right eye open right eye height

right eye width

eyes open (left eye open + right eye open) lowast 100

(3)

For the movement of the eyebrows up and down and thedistance between the eyebrows the following equations wereused

left eyebrow width PrimeLeft eyebrow widthPrime

right eyebrow width PrimeRight eyebrow widthPrime

eyebrows center PrimeY minus axis position of between the eyebrowsPrime

eyebrows distance eyebrows centerlowast 2

left eyebrow width + right eyebrow width1113888 1113889 minus 071113888 1113889lowast 500

(4)

Table 3 Single activity unit (AU) of FACS (except from P Ekman and W V Friesen ldquoMeasuring Facial Movementrdquo EnvironmentalPsychology and Nonverbal Behavior vol 1 pp 56ndash75 1976)

AU number Description Facial muscle1 Inner brow raiser Frontalis (pars medialis)2 Outer brow raiser (unilateral right side) Frontalis (pars lateralis)4 Brow lowered Depressor glabellae depressor supercilii and corrugator supercilii5 Upper lid raiser Levator palpebrae superioris6 Cheek raiser Orbicularis oculi (pars orbitalis)7 Lid tightener Orbicularis oculi (pars palpebralis)10 Upper lip raiser Levator labii superioris and caput infraorbitalis12 Lip corner puller Zygomatic major14 Dimpler Buccinator15 Lip corner depressor Depressor anguli oris (triangularis)16 Lower lip depressor Depressor labii inferioris25 Lips part Depressor labii or relaxation of mentalis (AU17) or orbicularis oris26 Jaw drop Masseter temporalis and internal pterygoid relaxed27 Mouth stretch Pterygoids and digastric28 Lip suck Orbicularis oris

4 Mobile Information Systems

-e operation here is that we used the data of individualswith extreme results to calculate the maximum and mini-mum values of the distance between the eyebrows and then

mapped this value to the range of 0ndash100 -e followingsimilar operations are the same as this one

left eyebrow height PrimeDistance from the right contour of the left eyebrow to the left eyersquosPrime

right eyebrow height PrimeDistance from the left contour of the right eyebrow to the right eyersquosPrime

eyebrows height left eyebrow height + right eyebrow height

left eyebrow width + right eyebrow width1113888 1113889 minus 041113888 1113889lowast 250

(5)

In this way we determined the values for the sample AUsfrom the expression pictures needed to calculate the five-dimensional values the closure of the mouth corresponds tomouth_open the curvature of the mouth corners corre-sponds tomouth_rad the closure of the eyes corresponds toeyes_open the distance between the eyebrows correspondsto eyebrows_distance and the movement of eyebrows up anddown corresponds to eyebrows_height

221 Correlation Analysis We conducted a Pearson corre-lation analysis on the five variables obtained -e overall cor-relation coefficient is represented by ρ in a populationGenerally ρ is unknown so we needed to estimate ρ ap-proximately through a sample correlation coefficient r Becauseof the different values within the individual samples the valuesof r varied -erefore we needed to examine the suitability ofthe sample correlation coefficient and carry out a significancetest on it By using SPSS we calculated the sample AUs from theKDEF database with the formulas that were described-en weconducted the Pearson correlation analysis with the five

variables (ie the dimensions of the corresponding emotionsobtained) After that we usedmin-max normalization which isa commonly used normalization method and calculated thecorrelation coefficients of the five simple AUs -ey could thenbe used as feature weights when we calculated the five-di-mensional values -e comparison results of the correlationcoefficients of the five sample AUs are shown in Table 4

We calculated the corresponding five-dimensionalvalues of the images from the KDEF database by applying theobtained coefficients

222 Multilayer Perceptron A multilayer perceptron cansolve most linear nonseparable problems In this part of thestudy we used a multilayer perceptron which was com-posed of five layers of neurons -is network included aninput layer hidden layers and an output layer and it usedgradient clipping and dropout methods to regularize

In the multilayer perceptron we used a softmax acti-vation function and a cross-entropy loss function Weconsidered the five-dimensional data of an expression

Figure 1 Facial feature points of the KDEF expressions

Mobile Information Systems 5

picture as a group and therefore had 980 groups of data Wedivided this data into 880 groups for training and theremaining 100 groups were used as test data

After 5180104 iterations of training the accuracy of themodel achieved is shown in Table 5

In order to verify the accuracy of the model we alsoinvited 12 volunteers of different ages to identify the same100 randomly selected facial images and calculated theiraccuracy At the same time we let the model identify the 100facial images -e final results are shown in Table 6

-e results show that the accuracy of the model rec-ognition after training was higher than that of humanvolunteers

3 Comparison between the Five-DimensionalEmotional Model and the PAD EmotionalEvaluation Model

31 PAD Emotional Evaluation Model -e PAD emotionalevaluation model is a model based on emotional dimensiontheory According to the evaluation method of semanticdifferences Mehrabian and Russell defined the three di-mensions of emotion as ldquoPrdquo for pleasurendashdispleasure ldquoArdquo forarousalndashnonarousal and ldquoDrdquo for dominancendashsubmissive-ness [2ndash4] Among them P represents the positive ornegative degree of an individualrsquos emotional state A rep-resents the physiological activation level of an individualrsquosnerves indicating that the individual is excited or depressedand D represents the subjective control state an active orpassive state of the individual to the situation and others

-e PAD emotional evaluation model as the most fa-mousmodel in emotional dimension theory has been widelyused in emotional research In this part of the study we

tested the applicability of the five-dimensional emotionmodel by comparing it with the PAD emotional model

32 Contrast Experiment of Accuracy under the SameConditions For the PAD emotional model we used thesame sample and multilayer perceptron with the same inputlayer and hidden layers as the five-dimensional emotionalmodel for training

Lee et al at the Institute of Psychology ChineseAcademy of Sciences compiled the Chinese version of thePAD emotion scales and evaluated the PAD values of the 14basic emotions -is included basic emotions proposed bydifferent researchers such as happiness anger relaxationand fear [25] In the current research we used PAD valuescorresponding to seven basic emotions from the KDEFdatabase to analyze the correlation of the results -e PADvalues of the seven emotions are shown in Table 7

We analyzed the Pearson correlations between the PADvalues of emotions and the five sample AUs and took themin-max normalized coefficients as the feature weights ofthe PAD emotional evaluation model -e coefficients ob-tained are shown in Table 8

Based on these parameters we transformed the sampleAUs of the expression images of the KDEF database into thecorresponding PAD values and then trained the multilayerperceptron with the data After 6000000 iterations oftraining we were able to compare the accuracies of the PADemotional model and the five-dimensional emotion model-e final results are shown in Table 9

Based on this result we felt that the five-dimensionalemotion model was better than the PAD emotional evalu-ation model when applied to emotion recognition -ecomparison results are graphically shown in Figure 2

Table 4 Coefficients of the sample AUs corresponding to the five dimensions

mouth_open mouth_rad eyes_open eyebrows_height eyebrows_distanceIntensity 1000 0657 0000 0569 0341Pleasure 0325 1000 0156 0000 0346Control 0195 1000 0169 0000 0618Certainty 0000 0733 0609 0224 1000Tension 0648 0000 0755 1000 0163

Table 5 Model accuracy

Whole data Training set sampling data Test data078 079 073

Table 6 Model and volunteersrsquo recognition accuracy

ModelVolunteers

X1 X2 X3 X4 X5 X6 X7 X8 X9 X10 X11 X12 X074 054 064 066 061 058 068 067 069 067 066 061 068 064

6 Mobile Information Systems

4 Comparison between the Five-DimensionalEmotional Model and the MotivationalDimensional Model

-e PAD model proposes the existence of a dominancedimension in addition to the valence and arousal dimen-sions Lee [25] evaluated the values of the basic emotions onthe PAD dimension Since the three dimensions of the PADmodel are independent of each other we can eliminate theindividual dominance dimension and keep only the values ofpleasure and arousal for the valence and arousal dimensions

-e motivational dimensional model of affect points tothe existence of a motivational dimension in addition to thevalence and arousal dimensions Xia [26] designed experi-ments to assess the motivational dimension of basic emo-tions on a scale of 1ndash9 By normalizing his experimentalresults with Leersquos we can map them to a range of minus4 to 4 Wecan obtain a numerical representation of basic emotions onthe motivational dimension model which is shown inTable 10

We analyzed the Pearson correlations between themotivational dimension value and the five sample AUs andtook the min-max normalized coefficients as the featureweights of the motivational dimensional model -e coef-ficients obtained are shown in Table 11

Based on these parameters we transformed the sampleAUs of the expression images of the KDEF database into thecorresponding motivational dimension value and thentrained the multilayer perceptron with the data After2742366 iterations of training we were able to compare theaccuracy of the motivational dimensional model and thefive-dimensional emotion model -e final results are shownin Table 12

5 Application of the Five-DimensionalEmotion Model

Human emotion has a wide range of application prospectsand potential economic value in man-machine interactionelectronic education robotics entertainment the medicalfield and other areas So it has attracted a great deal ofattention in academia and industry In both arenas a largenumber of related projects about human emotion have beencarried out Some of the companies that have been interestedand involved in this research have been IBM in the UnitedStates Sony in Japan Philips in Europe and many othersChina has gradually attached importance to this kind ofresearch where it has been explored at the Beijing Universityof Science and Technology the Institute of Automation ofthe Chinese Academy of Sciences and others In a fewwords research on the five-dimensional emotion model as

Table 7 PAD values corresponding to seven emotions

EmotionAverage value

P A DHappiness 277 121 142Anger minus198 110 060Sad minus089 017 minus070Fear minus093 130 minus064Surprise 172 171 022Disgust minus180 040 067Calm minus053 minus125 minus084

Table 8 Coefficient of PAD values corresponding to sample AUs

mouth_open mouth_rad eyes_open eyebrows_height eyebrows_distanceP 1000 0968 0488 0190 0000A 1000 0000 0293 0475 0150D 0930 1000 0000 0254 0635

Table 9 Comparison of accuracy between PAD and the five-dimensional emotion model

PAD Five-dimensional emotion modelWhole data 069 078Training set samplingdata 066 079

Test data 070 073

Whole data Training set sampling data Test data

090807060504030201

0

DifferenceFive-dimensional modelPAD model

Figure 2 Comparison of accuracy between PAD and the five-dimensional emotion model expressions

Mobile Information Systems 7

applied in emotion recognition has far-reaching scientificsignificance and extensive application value To this end wetook the adaptive man-machine interactive system based onthe five-dimensional emotion model as an example to il-lustrate its practical application

Man-machine interaction is a subject that studies thedesign evaluation and implementation of interactive com-puting systems for the convenience of human users Anadaptive interface is the mainstream of man-machine inter-action research and application Now with the development ofartificial intelligence emotion is more and more important inthe research of an adaptive interface-rough the introductionof the five-dimensional emotional model the man-machineinteractive interface can observe a userrsquos emotions so as toprovide a more user-friendly user experience

For a general user interface color and backgroundmusicare part of the core content Based on this we designed thestyle of our user interface to change when users were ex-periencing different emotions

Before we designed it we needed to make some adaptiveimprovements to the five-dimensional emotion modelBecause different people have different facial features be-havioral habits and uncontrollable expression changes thiswill lead to recognition errors if the model worked accordingto only one frame of expression while performing a con-tinuous emotional judgment We needed to make someimprovements to the model that is to prepare a queue oflength N each element of which was a final result of eachemotion detection When the queue was full the modelwould perform an enqueue or dequeue operation From thiswe would receive a userrsquos continuous emotion detectionresults N times Because emotion is a continuously changing

process this method would receive more accurate results byanalyzing the emotions continuously

According to Zhang Congcongrsquos experiment [27] inclassic orchestra Bachrsquos evaluation was significantly higherthan others his music in aminor key had P andD values thatwere significantly higher than the music in a major key andthe PAD values of his faster music were significantly higherthan those of his slower music In terms of color the A valueof longer wave color was higher than that of shorter wavesfor multicolor the PAD values of saturated and bright colorswere higher than those of soft and dark colors as a whole-ere was a nonlinear relationship between the D value andbrightness but in black and white the D value of black wassignificantly higher than that of white and gray

Based on the results of his study we captured a userrsquosexpression by accessing the userrsquos camera and then used thefive-dimensional emotion model to detect the userrsquos con-tinuous emotions and calculate the average value of thedimensions For this we could determine which dimensionsneeded to be strengthened most Finally through colorgradient animation music smooth switching and otherapplications of JavaScript the model displayed the interfacestyle and background music that could improve the requireddimension value and guide the userrsquos emotion in a positivedirection

For example when the user was in a low-pleasure andlow-control mood we could use high-saturation blue as themain style and Bachrsquos fast minor-keyed Orchestral Suite No 2in B Minor BWV 1067 VII Badinerie as the backgroundmusic to improve the userrsquos pleasure and control When theuser was in a high pleasure mood we could use high-satu-ration red as the main style and Bachrsquos fast major-keyed

Table 11 Coefficient of the motivational dimension values corresponding to sample AUs

mouth_open mouth_rad eyes_open eyebrows_height eyebrows_distanceValence 0714 1000 0354 0136 0000Arousal 1000 0329 0200 0432 0000Motivational 0309 1000 0183 0000 0331

Table 12 Comparison of accuracy between the motivational dimensional model and the five-dimensional emotion model

Motivational dimensional model Five-dimensional emotion modelComplete data 071 078Training set sampling data 068 079Test data 071 073

Table 10 -e motivational dimension values corresponding to seven emotions

EmotionAverage value

Valence Arousal MotivationalHappiness 277 121 134Anger minus198 110 minus059Sad minus089 017 minus096Fear minus093 130 minus143Surprise 172 171 minus032Disgust minus180 040 minus136Calm minus053 minus125 minus003

8 Mobile Information Systems

Brandenburg Concerto No 2 in F BWV1047I [Allegro] asbackground music to improve the userrsquos intensity and keepthe user happy for a longer time

6 Conclusion

Artificial intelligence has played a promising and productiverole in many fields but an AI that cannot distinguish emotioncan only exist as a machine and loses much of the humanexperience -erefore we constructed the five-dimensionalemotion model in this paper and made research and ex-ploration in the field of emotion recognition -e highlightsare as follows

(i) We determined five dimensions of the proposedemotion model as intensity pleasure control cer-tainty and tension and simplified the AUs of theFACS Based on the KDEF database we determinedthe feature weights of the five dimensions corre-sponding to the sample AUs and used a multilayerperceptron to train our model with higher accuracythan recognition by human volunteers

(ii) We compared the five-dimensional emotion modelwith the traditional PAD emotional evaluationmodel under the same conditions and demonstratedthat our proposed model was better in terms ofaccuracy

(iii) Finally we discussed the practical application of thefive-dimensional emotion model with the illustra-tion of a prototype emotion-detecting interface asan example

Data Availability

-e data used in the study are available in Karolinska Di-rected Emotional Faces (KDEF) httpswwwkdefseindexhtml

Conflicts of Interest

-e authors declare that there are no conflicts of interestregarding the publication of this paper

References

[1] W. James, Functional Psychology, Hubei Science & Technology Press, Wuhan, China, 2016.

[2] A. Mehrabian, "Framework for a comprehensive description and measurement of emotional states," Genetic, Social, and General Psychology Monographs, vol. 121, no. 3, pp. 339–361, 1995.

[3] A. Mehrabian, "Pleasure-arousal-dominance: a general framework for describing and measuring individual differences in temperament," Current Psychology, vol. 14, no. 4, pp. 261–292, 1996.

[4] A. Mehrabian, C. Wihardja, and E. Ljunggren, "Emotional correlates of preferences for situation-activity combinations in everyday life," Genetic, Social, and General Psychology Monographs, vol. 123, no. 4, pp. 461–477, 1997.

[5] N. H. Frijda, The Emotions, pp. 62–65, Cambridge University Press, Cambridge, UK, 1986.

[6] C. E. Izard, Human Emotions, Springer, New York, NY, USA, 1977.

[7] L. Wang, "Research on Agent emotion model based on the OCC model," Microcomputer Information, vol. 23, 2007.

[8] J. Han, "Cognitive emotion model for eldercare robot in smart home," China Communications, vol. 12, no. 4, pp. 32–41, 2015.

[9] Z. Mi, "Research on dimension emotion recognition model based on ConvLSTM network," Computer Engineering and Applications, 2020.

[10] S. Wang, "Perceptual image retrieval based on emotion model," Journal of Circuits and Systems, vol. 8, no. 6, 2003.

[11] C. Fu, C. Liu, C. T. Ishi, and H. Ishiguro, "Multi-modality emotion recognition model with GAT-based multi-head inter-modality attention," Sensors, vol. 20, no. 17, p. 4894, 2020.

[12] H. Cai, "Emotion classification model based on word embedding and CNN," Application Research of Computers, vol. 33, no. 10, 2016.

[13] P. A. Gable and E. Harmon-Jones, "Approach-motivated positive affect reduces breadth of attention," Psychological Science, vol. 19, no. 5, pp. 476–482, 2008.

[14] P. Gable and E. Harmon-Jones, "The blues broaden, but the nasty narrows," Psychological Science, vol. 21, no. 2, pp. 211–215, 2010.

[15] P. A. Gable and E. Harmon-Jones, "The effect of low versus high approach-motivated positive affect on memory for peripherally versus centrally presented information," Emotion, vol. 10, no. 4, pp. 599–603, 2010.

[16] P. Gable and E. Harmon-Jones, "The motivational dimensional model of affect: implications for breadth of attention, memory, and cognitive categorisation," Cognition & Emotion, vol. 24, no. 2, pp. 322–337, 2010.

[17] P. Ekman, W. V. Friesen, and S. S. Tomkins, The Facial Affect Scoring Technique (FAST), Consulting Psychologists Press, Palo Alto, CA, USA, 1971.

[18] C. A. Smith and P. C. Ellsworth, "Patterns of cognitive appraisal in emotion," Journal of Personality and Social Psychology, vol. 48, no. 4, pp. 813–838, 1985.

[19] J. W. Kalat and M. N. Shiota, Emotions, China Light Industry Press, Beijing, China, 2009.

[20] Z. Meng, "A preliminary attempt to determine the facial expression patterns of babies," Acta Psychologica Sinica, vol. 1, pp. 55–61, 1985.

[21] C. E. Izard, "Emotions as motivations," in Nebraska Symposium on Motivation, pp. 163–200, University of Nebraska Press, Lincoln, NE, USA, 1978.

[22] Z. Meng, "Why facial expressions can be used as objective indicators of emotion research," Acta Psychologica Sinica, vol. 2, pp. 124–128, 1987.

[23] P. Ekman, W. V. Friesen, and S. S. Tomkins, The Facial Action Coding System (FACS), Consulting Psychologists Press, Palo Alto, CA, USA, 1978.

[24] D. Lundqvist, A. Flykt, and A. Öhman, The Karolinska Directed Emotional Faces-KDEF, Karolinska Institute, Solna, Sweden, 1998, https://www.kdef.se/index.html.

[25] X. Lee, PAD Three-Dimensional Model, China Computer World, Beijing, China, 2007.

[26] M. Xia, Convergence and Avoidance of Efficacy and Motivation Dimensions in Emotion Recognition and Efficacy Preference Effects, Southwest University, Chongqing, China, 2015.

[27] C. Zhang, Emotional Relationship between Music and Color, East China Normal University, Shanghai, China, 2014.

Page 8: TheStudyofaFive-DimensionalEmotionalModelforFacial ... · pictureasagroupandthereforehad980groupsofdata.We dividedthisdatainto880groupsfortraining,andthe remaining100groupswereusedastestdata

applied in emotion recognition has far-reaching scientificsignificance and extensive application value To this end wetook the adaptive man-machine interactive system based onthe five-dimensional emotion model as an example to il-lustrate its practical application

Man-machine interaction is a subject that studies thedesign evaluation and implementation of interactive com-puting systems for the convenience of human users Anadaptive interface is the mainstream of man-machine inter-action research and application Now with the development ofartificial intelligence emotion is more and more important inthe research of an adaptive interface-rough the introductionof the five-dimensional emotional model the man-machineinteractive interface can observe a userrsquos emotions so as toprovide a more user-friendly user experience

For a general user interface color and backgroundmusicare part of the core content Based on this we designed thestyle of our user interface to change when users were ex-periencing different emotions

Before we designed it we needed to make some adaptiveimprovements to the five-dimensional emotion modelBecause different people have different facial features be-havioral habits and uncontrollable expression changes thiswill lead to recognition errors if the model worked accordingto only one frame of expression while performing a con-tinuous emotional judgment We needed to make someimprovements to the model that is to prepare a queue oflength N each element of which was a final result of eachemotion detection When the queue was full the modelwould perform an enqueue or dequeue operation From thiswe would receive a userrsquos continuous emotion detectionresults N times Because emotion is a continuously changing

process this method would receive more accurate results byanalyzing the emotions continuously

According to Zhang Congcongrsquos experiment [27] inclassic orchestra Bachrsquos evaluation was significantly higherthan others his music in aminor key had P andD values thatwere significantly higher than the music in a major key andthe PAD values of his faster music were significantly higherthan those of his slower music In terms of color the A valueof longer wave color was higher than that of shorter wavesfor multicolor the PAD values of saturated and bright colorswere higher than those of soft and dark colors as a whole-ere was a nonlinear relationship between the D value andbrightness but in black and white the D value of black wassignificantly higher than that of white and gray

Based on the results of his study we captured a userrsquosexpression by accessing the userrsquos camera and then used thefive-dimensional emotion model to detect the userrsquos con-tinuous emotions and calculate the average value of thedimensions For this we could determine which dimensionsneeded to be strengthened most Finally through colorgradient animation music smooth switching and otherapplications of JavaScript the model displayed the interfacestyle and background music that could improve the requireddimension value and guide the userrsquos emotion in a positivedirection

For example when the user was in a low-pleasure andlow-control mood we could use high-saturation blue as themain style and Bachrsquos fast minor-keyed Orchestral Suite No 2in B Minor BWV 1067 VII Badinerie as the backgroundmusic to improve the userrsquos pleasure and control When theuser was in a high pleasure mood we could use high-satu-ration red as the main style and Bachrsquos fast major-keyed

Table 11 Coefficient of the motivational dimension values corresponding to sample AUs

mouth_open mouth_rad eyes_open eyebrows_height eyebrows_distanceValence 0714 1000 0354 0136 0000Arousal 1000 0329 0200 0432 0000Motivational 0309 1000 0183 0000 0331

Table 12 Comparison of accuracy between the motivational dimensional model and the five-dimensional emotion model

Motivational dimensional model Five-dimensional emotion modelComplete data 071 078Training set sampling data 068 079Test data 071 073

Table 10 -e motivational dimension values corresponding to seven emotions

EmotionAverage value

Valence Arousal MotivationalHappiness 277 121 134Anger minus198 110 minus059Sad minus089 017 minus096Fear minus093 130 minus143Surprise 172 171 minus032Disgust minus180 040 minus136Calm minus053 minus125 minus003

8 Mobile Information Systems

Brandenburg Concerto No 2 in F BWV1047I [Allegro] asbackground music to improve the userrsquos intensity and keepthe user happy for a longer time

6. Conclusion

Artificial intelligence has played a promising and productive role in many fields, but an AI that cannot distinguish emotion can only exist as a machine and loses much of the human experience. Therefore, we constructed the five-dimensional emotion model in this paper and carried out research and exploration in the field of emotion recognition. The highlights are as follows:

(i) We determined the five dimensions of the proposed emotion model as intensity, pleasure, control, certainty, and tension, and simplified the AUs of the FACS. Based on the KDEF database, we determined the feature weights of the five dimensions corresponding to the sample AUs and used a multilayer perceptron to train our model, achieving higher accuracy than recognition by human volunteers.

(ii) We compared the five-dimensional emotion model with the traditional PAD emotional evaluation model under the same conditions and demonstrated that our proposed model achieved better accuracy.

(iii) Finally, we discussed a practical application of the five-dimensional emotion model, using a prototype emotion-detecting interface as an illustration.

Data Availability

The data used in this study are available from the Karolinska Directed Emotional Faces (KDEF) database: https://www.kdef.se/index.html.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

References

[1] W. James, Functional Psychology, Hubei Science & Technology Press, Wuhan, China, 2016.

[2] A. Mehrabian, "Framework for a comprehensive description and measurement of emotional states," Genetic, Social, and General Psychology Monographs, vol. 121, no. 3, pp. 339–361, 1995.

[3] A. Mehrabian, "Pleasure-arousal-dominance: a general framework for describing and measuring individual differences in temperament," Current Psychology, vol. 14, no. 4, pp. 261–292, 1996.

[4] A. Mehrabian, C. Wihardja, and E. Ljunggren, "Emotional correlates of preferences for situation-activity combinations in everyday life," Genetic, Social, and General Psychology Monographs, vol. 123, no. 4, pp. 461–477, 1997.

[5] N. H. Frijda, The Emotions, pp. 62–65, Cambridge University Press, Cambridge, UK, 1986.

[6] C. E. Izard, Human Emotions, Springer, New York, NY, USA, 1977.

[7] L. Wang, "Research on agent emotion model based on the OCC model," Microcomputer Information, vol. 23, 2007.

[8] J. Han, "Cognitive emotion model for eldercare robot in smart home," China Communications, vol. 12, no. 4, pp. 32–41, 2015.

[9] Z. Mi, "Research on dimension emotion recognition model based on ConvLSTM network," Computer Engineering and Applications, 2020.

[10] S. Wang, "Perceptual image retrieval based on emotion model," Journal of Circuits and Systems, vol. 8, no. 6, 2003.

[11] C. Fu, C. Liu, C. T. Ishi, and H. Ishiguro, "Multi-modality emotion recognition model with GAT-based multi-head inter-modality attention," Sensors, vol. 20, no. 17, p. 4894, 2020.

[12] H. Cai, "Emotion classification model based on word embedding and CNN," Application Research of Computers, vol. 33, no. 10, 2016.

[13] P. A. Gable and E. Harmon-Jones, "Approach-motivated positive affect reduces breadth of attention," Psychological Science, vol. 19, no. 5, pp. 476–482, 2008.

[14] P. Gable and E. Harmon-Jones, "The blues broaden, but the nasty narrows," Psychological Science, vol. 21, no. 2, pp. 211–215, 2010.

[15] P. A. Gable and E. Harmon-Jones, "The effect of low versus high approach-motivated positive affect on memory for peripherally versus centrally presented information," Emotion, vol. 10, no. 4, pp. 599–603, 2010.

[16] P. Gable and E. Harmon-Jones, "The motivational dimensional model of affect: implications for breadth of attention, memory, and cognitive categorisation," Cognition & Emotion, vol. 24, no. 2, pp. 322–337, 2010.

[17] P. Ekman, W. V. Friesen, and S. S. Tomkins, The Facial Affect Scoring Technique (FAST), Consulting Psychologists Press, Palo Alto, CA, USA, 1971.

[18] C. A. Smith and P. C. Ellsworth, "Patterns of cognitive appraisal in emotion," Journal of Personality and Social Psychology, vol. 48, no. 4, pp. 813–838, 1985.

[19] J. W. Kalat and M. N. Shiota, Emotions, China Light Industry Press, Beijing, China, 2009.

[20] Z. Meng, "A preliminary attempt to determine the facial expression patterns of babies," Acta Psychologica Sinica, vol. 1, pp. 55–61, 1985.

[21] C. E. Izard, "Emotions as motivations," in Nebraska Symposium on Motivation, pp. 163–200, University of Nebraska Press, Lincoln, NE, USA, 1978.

[22] Z. Meng, "Why facial expressions can be used as objective indicators of emotion research," Acta Psychologica Sinica, vol. 2, pp. 124–128, 1987.

[23] P. Ekman, W. V. Friesen, and S. S. Tomkins, The Facial Action Coding System (FACS), Consulting Psychologists Press, Palo Alto, CA, USA, 1978.

[24] D. Lundqvist, A. Flykt, and A. Öhman, The Karolinska Directed Emotional Faces (KDEF), Karolinska Institute, Solna, Sweden, 1998, https://www.kdef.se/index.html.

[25] X. Lee, PAD Three-Dimensional Model, China Computer World, Beijing, China, 2007.

[26] M. Xia, Convergence and Avoidance of Efficacy and Motivation Dimensions in Emotion Recognition and Efficacy Preference Effects, Southwest University, Chongqing, China, 2015.

[27] C. Zhang, Emotional Relationship between Music and Color, East China Normal University, Shanghai, China, 2014.
