
A new tool to support diagnosis of neurological disorders by means of facial expressions

Vitoantonio Bevilacqua(1,2), Dario D'Ambruoso(1), Giovanni Mandolino(1), Marco Suma(1)

(1) Dipartimento di Elettrotecnica ed Elettronica, Politecnico di Bari, Via Orabona 4, 70125 Bari, Italy
(2) e.B.I.S. s.r.l. (electronic Business in Security), Spin-Off of Polytechnic of Bari, Via Pavoncelli 139, Bari, Italy

Corresponding author: [email protected]

Abstract — Recognizing facial emotions is an important aspect of interpersonal communication that may be impaired in various neurological disorders: Asperger's syndrome, autism, schizoid personality disorder, Parkinsonism, Urbach-Wiethe disease, amyotrophic lateral sclerosis, bipolar disorder, depression, and Alzheimer's disease. Although it is not possible to define emotions uniquely, we can say that they are mental states and physiological and psychophysiological changes associated with internal or external stimuli, both natural and learned. This paper highlights certain requirements that the specification approach would need to meet if the production of such tools were to be achievable. In particular, we present an innovative and still experimental tool to support the diagnosis of neurological disorders by monitoring facial expressions. At the same time, we propose a new study to measure several impairments of patients' emotion-recognition ability, and to improve the reliability of using them in computer-aided diagnosis strategies.

Keywords: Emotion Recognition; Action Units Recognition; Apathy; Absence Expression; Flattening Expression; Neurological Disorders.

I. INTRODUCTION

The ability to produce and recognize facial expressions of emotion is an important component of interpersonal communication in humans and primates. Cultures may differ in social rules and customs, yet certain facial expressions of emotion are universally recognized. These universal emotions include happiness, sadness, anger, fear, disgust, and surprise. The word "emotion" comes from the Latin "ex", meaning out, and "motio", meaning movement, so "emotion" indicates a movement, in particular a body movement.

Ekman's theory is based on experimental analysis and cross-cultural comparison (Ekman et al., 1972). It has been observed that an emotional expression produced within a specific population is interpreted correctly and uniformly by other populations, and vice versa. We can define as primary those expressions that are very common across the whole of humanity; that is, the common emotions that generate them can be defined as primary (FIGURE 1). Ekman thus identified six primary emotions:

Happiness;
Surprise;
Disgust;
Anger;
Fear;
Sadness.

FIGURE 1. Examples of Emotional Expressions, from left to right: Mild (row 1) and Extreme (row 2) expressions.

While universal emotions are recognized across different cultural and ethnic groups, social emotions, such as guilt, shame, arrogance, admiration and flirtatiousness, are particular to culture-specific interactions.

Processing of affect is supported by distributed neural systems and, as lesion studies show, preferentially by the right hemisphere. On the basis of human and primate studies, cortical areas of the inferior occipital gyrus, fusiform gyrus and inferior temporal gyrus have been reported as essential for face processing. The amygdala (FIGURE 2) receives input from these cortical areas and from the visual pathways of the midbrain and thalamus [1]. The amygdala is mainly involved in the processing of fear, but is activated upon passive and implicit presentations of facial expressions of emotions in general. Orbitofrontal areas, in particular on the right, are involved in explicit identification of facial emotions.


FIGURE 2. Anatomical section showing the limbic lobe.

The amygdala and frontal areas can modulate sensory areas via feedback mechanisms, can engage other cortical areas, such as the hippocampus and neocortex, and can generate a physical response via connections to the motor cortex, hypothalamus, and brainstem.

There are two main theories regarding hemispheric specialization in processing emotions. The right hemispheric hypothesis posits that the right hemisphere is specialized to process all emotions, whereas the valence hypothesis regards the right hemisphere as superior for processing negative emotions and the left hemisphere as superior for positive ones. Common negative emotions are sadness, fear and anger; common positive emotions are happiness and, possibly, surprise.

While cognitive dysfunction has been thoroughly evaluated across neuropsychiatric disorders, impairments in emotion recognition have received increasing attention within the past 15 years. Early investigations on emotion recognition were limited to groups of persons with schizophrenia, depression and brain injury; however, within the past decade, emotion recognition deficits have been explored in a wide range of brain-related disorders. We will now analyze the neurological disorders linked to emotion recognition deficits and marked impairment in the use of facial expressions.

II. EMOTIONAL RECOGNITION DEFICIT

In Asperger's disorder there is a marked impairment in the use of multiple nonverbal behaviors such as eye-to-eye gaze, facial expressions, body postures, and gestures to regulate social interaction, and a lack of spontaneous seeking to share enjoyment [2].

In Parkinsonism there is a known decrease of spontaneous facial expressions, gestures, speech, and body movements.

Individuals with schizoid personality disorder usually display a "bland" exterior without visible emotional reactivity and rarely reciprocate gestures or facial expressions, such as smiles or nods. They rarely experience strong emotions such as anger and joy.

In major depressive disorder there is a depressed mood that can be inferred from the person's facial expressions and demeanor. Some individuals emphasize somatic complaints (e.g., bodily aches and pains) rather than reporting feelings of sadness. Many individuals report or exhibit increased irritability (e.g., persistent anger, a tendency to respond to events with angry outbursts).

In autistic disorder there is a marked impairment in the use of multiple nonverbal behaviors, such as eye-to-eye gaze and facial expressions [3].

III. IMPAIRMENT IN THE USE OF FACIAL EXPRESSION

In schizophrenia the emotion processing deficits may relate to dysfunction of mesial temporal regions. Schizophrenia patients are impaired in overall emotion recognition, particularly fear and disgust, and neutral cues are frequently misidentified as unpleasant or threatening.

Subjects affected by bipolar disorder suffer from specific deficits of facial emotion perception. These patients present impaired recognition of disgust, fear and sadness [4].

Urbach-Wiethe disease (lipoid proteinosis) is a rare disorder with progressive calcification and necrosis of the amygdala. Patients with this disease show a disproportionate impairment in recognizing fear in facial expressions, and only a much milder impairment in recognizing the intensity of other negative emotions [5].

In patients with Alzheimer's disease (AD) there may be a deficit in processing some or all facial expressions of emotions [6].

Emotion recognition deficits occur in bulbar amyotrophic lateral sclerosis (ALS), particularly with emotional facial expressions, and can arise independent of depressive and dementia symptoms or co-morbidity with depression and dementia. These findings expand the scope of cognitive dysfunction detected in ALS, and bolster the view of ALS as a multisystem disorder involving cognitive and motor deficits [7].

IV. ACTION UNITS

Replicating the studies by Charles Darwin, the American psychologist Paul Ekman confirmed that an important feature of basic emotions is the fact that they are expressed universally. The face, as is known, is the space where, in a communication process, most of the sensory information is concentrated, whether exhibited (sender) or read (receiver).

According to Ekman and Friesen, "the face is a multi-signal, multi-message response system capable of tremendous flexibility and specificity". These signals are combined to produce different messages: the movement of the facial muscles pulls the skin, temporarily distorting the shape of the eyes, eyebrows and lips, and producing folds, wrinkles and swelling in different parts of the face. Before describing the tools, we explain and discuss the meaning of Action Units (AUs) [8].

AUs are the basic elements for the construction of an expression; they represent minimal facial actions that cannot be further separated into simpler actions. Muscle actions and Action Units do not coincide: an AU may correspond to the action of one or more muscles, and a muscle can be associated with several AUs. An AU is, in short, a basic change in the appearance of the face, caused by the activation of one or more facial muscles. The AUs are divided into groups according to the position and/or the type of action involved. The first AUs are in the UPPER FACE, and affect the eyebrows, forehead, and eyelids. The LOWER FACE AUs are then presented in five groups: Up/Down, Horizontal, Oblique, Orbital, and Miscellaneous.

V. METHODS

The basic idea is to track the face after detecting it, by means of a C++ software library for finding features in faces (STASM) [9] or Automatic Facial Feature Points Detection [10].

Stasm extends the Active Shape Model of Tim Cootes and his colleagues with other techniques. Before searching for facial features, Stasm uses the Rowley or the Viola-Jones face detector to locate the overall position of the face. STASM fits a standard shape to frontal face images with the Active Shape Model (ASM) algorithm. The shape points are fitted by moving them along a line orthogonal to the boundary, called a whisker. The point movement is made by weighing changes in gray levels along the whisker. Each point has an associated profile vector, and the image is sampled along the one-dimensional whisker. After the shape is fitted, STASM creates a log file and the image with the overlapped shape. The algorithm begins with the acquisition of the shape coordinate points from the log file. After selecting the face, Stasm tracks the fiducial points: given an image of a face, it returns the positions of the facial features. It also allows you to build your own Active Shape Models (FIGURE 3).

FIGURE 3. The "mean face" (yellow) positioned over a search face at the start of a search.

In this tool we used a modified version of Automatic Facial Feature Points Detection that performs a face segmentation localizing the various face components (eyebrows, eyes, mouth, nose). It now uses the Machine Perception Toolbox (MPT) [11] to find the face and eyes. After this, the frame is converted to a gray-scale image and binarized; the darkest clusters contain face features such as the eyes, mouth, nostrils and eyebrows. It detects 17 feature points (FIGURE 4) using image processing and anthropometric considerations: the two pupils, the four eye corners, the four eyebrow corners, the two nostrils, the nose tip, the two mouth corners, and the upper and lower lip extremities. It is adapted to work in real time.

FIGURE 4. Identification of feature points on the face.

VI. FROM ACTION UNIT TO EMOTION

The state of the art related to the encoding of emotions according to the Facial Action Coding System (FACS) [8] allowed us to compile a matrix (TABLE I) in which each AU is associated with the expressions (and emotions) in which it occurs. Below, only three of all the AUs are listed: precisely the ones we have selected as indicative of the difference between positive and negative attitudes.

TABLE I. COMPARISON OF THE SAME AUs ACROSS SOURCES

AU 1 (Inner Brow Raiser):
  [12]: Fear; Sadness; Surprise
  [13]: Fear; Sadness; Surprise
  [17]: Fear 100%; Sadness 100%; Surprise 87.5%; Disgust 37.5%
  [15]: Fear; Sadness; Surprise
  [16]: Fear; Sadness; Surprise

AU 4 (Brow Lowerer):
  [12]: Anger; Fear; Sadness
  [13]: Anger; Fear; Sadness
  Sadness 100%; Disgust 87.5%; Anger 85.5%
  [15]: Anger; Fear; Sadness
  [16]: Anger; Fear; Sadness
  [17]: Fear 75%; Embarrassment 12.5%; Surprise 37.5%

AU 12 (Lip Corner Puller):
  [12]: Happy
  [13]: Happy
  [14]: Happy
  [15]: Happy
  [16]: Happy
  [17]: Joy 100%; Happy 50%

VII. FACIAL ACTION UNITS RECOGNITION

Analyzing the photos at our disposal, as shown in TABLE I, we can conclude that to distinguish between positive and negative emotions it is not necessary to evaluate all the AUs; we therefore use AU 1 (FIGURE 6.a) or AU 4 (FIGURE 6.b) for negative expressions, and AU 12 (FIGURE 6.c) for the positive ones.
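The decision rule above can be sketched as a small function: AU 1 or AU 4 signals a negative expression, AU 12 a positive one. The function name, return labels and handling of simultaneous AUs are assumptions for illustration, not the tool's published code.

```cpp
#include <set>
#include <string>

// Classify an attitude from the set of detected Action Units:
// AU 1 / AU 4 -> negative, AU 12 -> positive (labels assumed).
std::string classifyAttitude(const std::set<int>& activeAUs) {
    bool negative = activeAUs.count(1) || activeAUs.count(4);
    bool positive = activeAUs.count(12) != 0;
    if (positive && negative) return "mixed";
    if (positive)             return "positive";
    if (negative)             return "negative";
    return "neutral";
}
```

A frame with only AU 12 active would thus be labeled "positive", while a frame with AU 1 or AU 4 would be labeled "negative".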

FIGURE 6. These images represent particular AUs, in order: Inner Brow Raiser, Brow Lowerer, Lip Corner Puller.

To evaluate the AUs we calculate the variation of the polygon areas, as shown in FIGURE 7.

AU-1 is revealed when the polygon created from the brow corners and the eye corners rises; it is strongly related to the sadness and fear emotions.

AU-4 is revealed when the same polygon of AU-1 decreases; it is related to the anger and disgust emotions.

AU-12 is revealed when the polygon created around the mouth is constant; it is strongly related to the happiness emotion (FIGURE 8.a).
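The polygon-area variation test can be sketched as follows, using the shoelace formula for the area of the tracked polygon. The 5% tolerance, the comparison against a neutral-frame baseline and the function names are illustrative assumptions; the tool's actual thresholds are not stated.

```cpp
#include <cmath>
#include <vector>

struct Pt { double x, y; };

// Shoelace formula: area of a simple polygon given ordered vertices.
double polygonArea(const std::vector<Pt>& pts) {
    double s = 0.0;
    for (size_t i = 0; i < pts.size(); ++i) {
        const Pt& a = pts[i];
        const Pt& b = pts[(i + 1) % pts.size()];
        s += a.x * b.y - b.x * a.y;
    }
    return std::fabs(s) / 2.0;
}

// Compare the brow/eye polygon against its area in the neutral frame:
// growth suggests AU-1 (inner brow raiser), shrinkage AU-4 (brow
// lowerer). `tol` absorbs tracking noise (5% is an assumed value).
int detectBrowAU(double neutralArea, double currentArea, double tol = 0.05) {
    double change = (currentArea - neutralArea) / neutralArea;
    if (change >  tol) return 1;   // AU-1
    if (change < -tol) return 4;   // AU-4
    return 0;                      // no brow action detected
}
```

The same area comparison, applied to the mouth polygon, would serve for AU-12.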

Furthermore, the tool is able to recognize some useful combinations of other AUs, which discriminate expressions of diagnostic importance (FIGURE 8.c).

    FIGURE 7. This image shows the neutral situation.

FIGURE 8. These images explain how the areas change in relation to the emotions.

Using the close links between AUs and emotions, this tool displays the emotions obtained, with their relative intensity. At the moment the tool evaluates the intensity of emotions through the area of the polygons: when the area grows, the intensity increases. Obviously the intensity of the emotion is also related to the appearance of wrinkles and furrows in well-determined regions of the face.
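A plausible mapping from polygon-area growth to an intensity value is sketched below. The 50% saturation point and the linear scaling are assumptions for illustration, not the tool's actual calibration.

```cpp
#include <algorithm>

// Map the relative growth of an expression polygon to an intensity
// in [0, 1]: no growth -> 0, growth of `maxGrowth` or more -> 1.
// The default saturation point (50% growth) is an assumed value.
double emotionIntensity(double neutralArea, double currentArea,
                        double maxGrowth = 0.5) {
    double growth = (currentArea - neutralArea) / neutralArea;
    return std::clamp(growth / maxGrowth, 0.0, 1.0);
}
```

A richer model would also incorporate the wrinkle and furrow cues mentioned above, which a pure area measure does not capture.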

VIII. IMPLEMENTATION

This tool is completely written in C++; it uses the OpenCV library for image processing and MPT for face and eye detection. The Graphical User Interface (GUI) is implemented with the Qt library; it is divided into two areas: the first one shows the processing frame (FIGURE 9), the second one shows the emotional state (FIGURE 10).


FIGURE 9. This image shows one of the two areas of the GUI: the processing frame.

    FIGURE 10. This image shows an example of emotion processing.

The emotional state is divided into:

1. Emotional details: it contains the percentages for the six emotions;

2. Emotional situation: it contains the overall situation of the human subject.
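One possible way to derive the overall situation from the six per-emotion percentages — treating happiness and, tentatively, surprise as positive, as discussed in the Introduction — is sketched below. The aggregation rule and names are assumptions; the paper does not specify how the second panel combines the percentages.

```cpp
#include <array>
#include <string>

// The six primary emotions shown in the "emotional details" panel.
enum Emotion { Happiness, Surprise, Disgust, Anger, Fear, Sadness };

// Derive an overall "emotional situation" from the six percentages
// by comparing positive and negative mass (rule assumed).
std::string overallSituation(const std::array<double, 6>& pct) {
    double positive = pct[Happiness] + pct[Surprise];
    double negative = pct[Disgust] + pct[Anger] + pct[Fear] + pct[Sadness];
    if (positive > negative) return "positive";
    if (negative > positive) return "negative";
    return "neutral";
}
```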

IX. CONCLUSIONS

Neurodegenerative diseases are among the most expensive for public health; they are frequently detected only when it is too late, inhibiting the possibility of efficacious therapy. The solution is prevention, in particular for people over fifty.

The target of this tool is to provide an instrument for an early diagnosis in the initial stage of the disease, exploiting the flat affect of these patients. Flat affect is the scientific term for the reduced or absent expression of emotions in the face [2]. This non-invasive instrument provides rapid results for possible or probable cases; obviously, it is not a deterministic method.

There is a direct relationship between neurological distress and the expressivity of some emotions; for example, depression presents persistence of anger and sadness. The tool ultimately aims to recognize the expressions of a face and then translate them into positive and negative attitudes. The final aim of all this is to help diagnose neurological disorders that involve a loss of expressiveness in the face: a photo or video is shown to the patient, the patient responds by changing facial expression, and the tool decodes the expression intensity or its absence. Therefore, when the tool reveals a loss of expressivity, i.e. the subject does not react adequately to the external stimuli, it signals the potential presence of disease. Apathy is among the most frequent behavioral disturbances present in patients with Alzheimer's disease. Thus, this application can associate the absence or flattening of emotional expression with some neurological disorders, such as Alzheimer's disease and depression.

X. REFERENCES

[1] Christian G. Kohler, Travis H. Turner, Raquel E. Gur, Ruben C. Gur et al., "Recognition of Facial Emotions in Neuropsychiatric Disorders", CNS Spectrums, vol. 9, no. 4, pp. 267-274, 2004;

[2] Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Washington D.C., American Psychiatric Association, 1994;

[3] Alan J. Lerner, Diagnostic Criteria in Neurology, 2006;

[4] Cristiana C. de Almeida Rocca, Eveline V. D. Heuvel, Sheila C. Caetano, Beny Lafer, "Facial Emotion Recognition in Bipolar Disorder: a Critical Review", Rev. Bras. Psiquiatr., vol. 31, no. 2, São Paulo, 2009;

[5] Chiara Cristinzio, David Sander, Patrik Vuilleumier, "Recognition of Emotional Face Expressions and Amygdala Pathology", Epileptologie, vol. 24, pp. 130-138, 2007;

[6] C. G. Kohler, G. Anselmo-Gallagher, W. Bilker, J. Karlawish et al., "Emotion-Discrimination Deficits in Mild Alzheimer Disease", Am J Geriatr Psychiatry, vol. 13, pp. 926-933, 2005;

[7] E. K. Zimmerman, P. J. Eslinger, "Emotional Perception Deficits in Amyotrophic Lateral Sclerosis", Dep. of Neurology, College of Medicine, Penn State/Hershey Medical Center, Cogn Behav Neurol., vol. 20, no. 2, pp. 79-82, 2007;

[8] Paul Ekman, FACS: Facial Action Coding System, Research Nexus division of Network Information Research Corporation, Salt Lake City, UT 84107, 2002;

[9] S. Milborrow and F. Nicolls, "Locating Facial Features with an Extended Active Shape Model", ECCV 2008, Part IV, LNCS 5305, pp. 504-513, 2008;

[10] Vitoantonio Bevilacqua, Alessandro Ciccimarra, Ilenia Leone, and Giuseppe Mastronardi, "Automatic Facial Feature Points Detection", LNAI 5227, edited by D.-S. Huang, pp. 1142-1149, 2008;

[11] The Machine Perception Toolbox, University of California San Diego, 2005, [Online]. Available: http://mplab.ucsd.edu/grants/project1/free-software/mptwebsite/introduction.html

[12] Irene Kotsia, Ioannis Pitas, "Facial Expression Recognition in Image Sequences using Geometric Deformation Features and Support Vector Machines", IEEE Transactions on Image Processing, vol. 16, no. 1, 2007;

[13] Bridget M. Waller, James J. Cray, Anne M. Burrows, "Selection for Universal Facial Emotion", Emotion, vol. 8, no. 3, pp. 435-439, 2008;

[14] Hadi Seyedarabi, Won-Sook Lee, "Classification of Upper and Lower Face Action Units and Facial Expressions using Hybrid Tracking System and Probabilistic Neural Networks", Proceedings of the 5th WSEAS International Conference on Signal Processing, pp. 161-166, 2006;

[15] Wallace V. Friesen, Paul Ekman, EMFACS: Emotional Facial Action Coding System, unpublished manual, University of California, 1984;

[16] Paul Ekman, Wallace V. Friesen, Joseph C. Hager, Facial Action Coding System Investigator's Guide, published by A Human Face, 666 Malibu Drive, Salt Lake City, UT 84107, 2002;

[17] Skyler T. Hawk, Gerben A. Van Kleef, Agneta H. Fischer, Job Van Der Schalk, "Worth a Thousand Words: Absolute and Relative Decoding of Nonlinguistic Affect Vocalizations", Emotion, vol. 9, no. 3, pp. 293-305, 2009;

[18] Anatomical basis of facial expression learning tool, Victoria Contreras Flores, 2005, [Online]. Available: http://www.artnatomia.net/uk/artnatomiaIng.html