
Psychiatry Research: Neuroimaging 191 (2011) 44–50

Contents lists available at ScienceDirect

Psychiatry Research: Neuroimaging

journal homepage: www.elsevier.com/locate/psychresns

Brain dysfunctions during facial discrimination in schizophrenia: Selective association to affect decoding

Javier Quintana a,b,⁎, Junghee Lee a,b, Michael Marcus a,b, Kimmy Kee a,b,c, Tiffany Wong a,b, Armen Yerevanian a,b

a Department of Psychiatry and Biobehavioral Sciences, UCLA, Los Angeles, CA 90095, United States
b Greater Los Angeles VA Healthcare System, VISN 22 Mental Illness Research, Education and Clinical Center (MIRECC), 11301 Wilshire Blvd, Bldg 210, Los Angeles, CA 90073, United States
c Department of Psychology, California State University at Channel Islands, Camarillo, CA 93012, United States

⁎ Corresponding author. Department of Psychiatry, UCLA-Greater Los Angeles VA Healthcare System, 11301 Wilshire Blvd, Bldg 210A (MIRECC), Room 128, Los Angeles, CA 90073, United States. Tel.: +1 310 478 3711x41009; fax: +1 310 268 4056.

E-mail address: [email protected] (J. Quintana).

0925-4927/$ – see front matter. Published by Elsevier Ireland Ltd.
doi:10.1016/j.pscychresns.2010.09.005

Article info

Article history: Received 13 November 2009; received in revised form 2 July 2010; accepted 8 September 2010

Keywords: Cognitive deficits; Social cognition; Facial information processing; Compensatory mechanisms

Abstract

Schizophrenia patients exhibit impaired facial affect perception, yet the exact nature of this impairment remains unclear. We investigated neural activity related to processing facial emotional and non-emotional information and complex images in 12 schizophrenia patients and 15 healthy controls using functional magnetic resonance imaging. All subjects performed a facial information processing task with three conditions: matching facial emotion, matching facial identity, and matching complex visual patterns. Patients and controls showed comparable behavioral performance in all task conditions. The neural activation patterns in schizophrenia patients and healthy controls were distinctly different while processing affect-related facial information but not other non-emotional facial features. During emotion matching, orbital frontal cortex and left amygdala activations were found in controls but not in patients. When comparing emotion versus identity matching, controls activated the fusiform and middle temporal gyri, left superior temporal gyrus, and right inferior and middle frontal gyrus, whereas schizophrenia patients only activated the middle and inferior frontal gyri, the frontal opercula and the right insular cortex. Our findings suggest that schizophrenia patients and healthy controls may utilize different neural networks when processing facial emotional information.



Published by Elsevier Ireland Ltd.

1. Introduction

Deficits in social functioning are a hallmark of schizophrenia. Understanding the abnormalities in neural mechanisms that may underlie these deficits has been the goal of an increasing number of studies, many of them focusing on how schizophrenia patients process socially relevant information. Facial affect processing, in particular, has been frequently studied in schizophrenia. Schizophrenia patients have difficulty identifying and discriminating emotional expressions of faces (for a review, see Mandal et al., 1998). The impairment is present in first-episode (Edwards et al., 2001) as well as in chronic patients (Addington and Addington, 1998), and it is associated with poor social functioning: schizophrenia patients with worse affect perception exhibit lower levels of interpersonal, social, and occupational functioning (Ihnen et al., 1998; Kee et al., 2003; Brekke et al., 2005). Despite the growing number of studies on facial affect perception deficits in schizophrenia, it is still unclear at which stage of processing the deficits occur and which neural abnormalities may cause them. Whereas the latter has been investigated in several studies (Gur et al., 2002; Quintana et al., 2003b; Fakra et al., 2008), neither these nor other behavioral studies have clarified the former.

The human face is one of the most familiar yet complex visual stimuli that we encounter every day, and deciphering facial information efficiently is crucial for successful social functioning. Discriminating facial affect information requires at least three processing steps: 1) processing complex visual stimuli, 2) encoding and processing basic facial features, and 3) decoding emotional information from face stimuli (Wynn et al., 2008). These three components are often associated with distinct neural activation patterns, not only at the temporal but also at the spatial level. Using event-related potentials (ERPs), for example, basic facial feature processing has been linked to an N170 component (Bentin et al., 1996; Eimer, 2000), whereas decoding emotional information from facial stimuli has often been linked to an N250 (Streit et al., 1999, 2001) and, at least implicitly, to early (120–160 ms) components (Kawasaki et al., 2001). These temporally distinct steps in processing facial affect information are themselves associated with rather distinct, spatially identifiable neural networks. In healthy individuals, the fusiform gyrus has been suggested as a key region for processing the so-called invariant features of the face (i.e., structural features of faces that are critical for distinguishing one face from another) (Bruce and Young, 1986; Haxby et al., 2000). Processing facial affect information at large involves a wide neural network that includes the fusiform gyrus, the middle and the superior temporal gyrus, and the limbic system including the amygdala (Adolphs et al., 1999; Critchley et al., 2000; Haxby et al., 2000). The activation of parts of this network, in particular the amygdala, seems to be modulated by the way facial information is presented or by whether particular cognitive processes are required to process such information. For example, implicit affect processing is accompanied by larger amygdala activation than explicit affect processing (Critchley et al., 2000; Hariri et al., 2000, 2003).

The contribution of each processing component to the facial affect perception impairment in schizophrenia patients has been the focus of several studies. Schizophrenia patients exhibit deficits in processing basic visual stimuli (e.g., Butler et al., 2001), suggesting that perhaps their overall impaired facial affect perception is not the sole result of specific deficits in decoding facial affect. This is further emphasized by the finding, using a behavioral task, that schizophrenia patients exhibit deficits in visually detecting and discriminating non-emotional facial features (Chen et al., 2009). Studies using ERPs convey a somewhat complicated picture of impaired facial affect recognition in schizophrenia. Several of these studies have reported a reduced amplitude of the N170 component, linked to basic facial feature processing, in schizophrenia patients compared with controls (Onitsuka et al., 2006; Turetsky et al., 2007; Lynn and Salisbury, 2008; Lee et al., 2010), whereas others have shown emotion-specific authentication abnormalities in schizophrenia (Wynn et al., 2008).

We previously examined whether neural activation patterns during facial information processing differ between schizophrenia patients and healthy controls (Quintana et al., 2003b). The purpose of that study was to identify neural abnormalities related to general facial information processing in schizophrenia. Its elaborate experimental design (patients and controls split into subgroups, both patient-control subgroups performing tasks with facial information discrimination requirements but otherwise different cognitive demands) allowed us to determine that, regardless of those other task demands or whether the facial information included affect, facial information processing at large was pervasively accompanied by decreased activation in the right lateral fusiform gyrus of schizophrenia patients that could be observed even at the individual level. Such an experimental task design, however, did not allow us to separately compare the processing of emotional information and of basic facial features within or between subject groups. In addition, the study used simple shapes (i.e., circles and ellipses) as stimuli in the task condition designed to control for general visual processing. Hence, our findings did not clarify whether facial affect perception deficits in schizophrenia are the result of specific impairments in any of the information processing steps involved (i.e., complex visual stimuli, basic facial features, or facial affect).

This new study was thus designed to further examine, using functional magnetic resonance imaging (fMRI) in schizophrenia patients and healthy controls, neural activation patterns associated with facial information processing, with the goal of identifying the specific abnormalities underlying facial affect discrimination deficits in schizophrenia. Based on the results of our previous study, we hypothesized that schizophrenia patients would exhibit activation abnormalities while processing both basic facial features and facial affect, though likely of a different degree. To test our hypothesis, we wanted to compare how schizophrenia patients activate neural networks when processing emotional information versus basic facial features, while controlling for their ability to process complex visual stimuli. Hence, the study would enable us to directly assess putative neural abnormalities at each stage of facial affect processing in schizophrenia patients.

2. Methods

2.1. Subjects

Twelve (three female) schizophrenia patients and 15 (three female) healthy controls participated in the study. All subjects provided written consent under protocols approved by the Institutional Review Boards of the Greater Los Angeles VA Healthcare System (VAGLAHS) and UCLA, after their capacity to provide such consent was evaluated and they were fully informed of the nature and consequences of the study. Schizophrenia patients were recruited from outpatient treatment clinics at the VAGLAHS and from local board and care facilities. They met the diagnostic criteria for schizophrenia using the Structured Clinical Interview for DSM-IV Axis I Disorders (First et al., 1997). Exclusion criteria for patients included: 1) substance abuse or dependence in the last six months, 2) mental retardation, 3) history of loss of consciousness for more than 1 h, 4) any identifiable neurological disorder, 5) insufficient fluency in English, and 6) a psychiatric hospital admission, clinical decompensation, or changes in psychotropic medication over the 6-month period prior to the study.

Healthy controls were recruited through flyers posted in the local community. Exclusion criteria for controls included: 1) history of schizophrenia or other psychotic disorder, bipolar disorder, or recurrent depression, 2) history of substance abuse or dependence in the last six months, 3) schizophrenia or other psychotic disorder in a first-degree relative, 4) any significant neurological disorder or head injury, and 5) insufficient fluency in English.

Schizophrenia patients and healthy controls were comparable in terms of age and education (age, 45.0±9.9 and 40.0±9.0 years, respectively, t(25)=1.36, p=0.18; education, 13.3±2.3 and 13.9±2.3 years, respectively, t(18)=−0.57, p=0.57). There was one left-handed subject (as determined by self-report and by responses to a modified Edinburgh handedness questionnaire) (Oldfield, 1971) in each group. Hence, gender and handedness distributions were comparable for both groups (χ2=0.09, p=0.75 and χ2=0.02, p=0.86, respectively). All schizophrenia subjects were clinically stable chronic patients (duration of illness, 17.1±8.2 years) receiving antipsychotic agents (risperidone, N=4; olanzapine, N=2; aripiprazole, N=1; thiothixene, N=1; ziprasidone, N=1; haloperidol, N=1; Stelazine, N=1; unknown, N=1); three of the patients were also receiving the anticholinergic drug benztropine. Clinical symptoms for schizophrenia patients were evaluated with the Brief Psychiatric Rating Scale (BPRS) and the Scale for the Assessment of Negative Symptoms (SANS). Mean BPRS score was 40.3±15.7 and mean SANS score (excluding the attention subscale) was 33.7±15.9. All participants had normal or corrected-to-normal vision.

2.2. Experimental design and tasks

All participants completed three consecutive runs of a simultaneous facial information matching (SFIM) task (see Fig. 1) in the MRI scanner. The task was presented to the subjects via MR-compatible LCD goggles (Resonance Technology, Northridge, CA) using a Macintosh laptop computer and customized in-house software (Stimulus Creator).

Each run of the SFIM task lasted 292.5 s and consisted of five "control" and four "experimental" alternating blocks of trials. Each block lasted 32.5 s and consisted of an initial 2.5-s instruction (e.g., "match emotions", "match identities", or "match patterns") followed by six 5-s trials. Each trial displayed three stimuli on a white background in a triangular arrangement: one stimulus at the screen's top center (the cue) and two stimuli side-by-side at the bottom of the screen (the response choices). The subjects were instructed to press the left or right button of a magnet-compatible mouse device with the index or the middle finger of their preferred hand to indicate the choice stimulus (left or right, at the bottom of the screen) that matched the cue (displayed at the top) according to the instruction for the block. Within each trial, the corresponding stimulus sets stayed on the screen for the entire 5 s, even after the subject had made a choice. Each experimental block in a run was surrounded by two control blocks. The order of experimental blocks within runs was counterbalanced across subjects of each group.
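As a consistency check, the block and run durations above can be reproduced from the stated trial structure; a minimal Python sketch follows, using the TR of 2.5 s reported in Section 2.3 to derive the number of volumes per run:

```python
# Timing of one SFIM run, using the durations stated in the text.
INSTRUCTION_S = 2.5        # instruction screen at the start of each block
TRIAL_S = 5.0              # each trial stays on screen for 5 s
TRIALS_PER_BLOCK = 6
N_BLOCKS = 5 + 4           # five control + four experimental blocks
TR_S = 2.5                 # repetition time (Section 2.3)

block_s = INSTRUCTION_S + TRIALS_PER_BLOCK * TRIAL_S  # 32.5 s per block
run_s = N_BLOCKS * block_s                            # 292.5 s per run
n_volumes = run_s / TR_S                              # 117 volumes per run

print(block_s, run_s, n_volumes)  # 32.5 292.5 117.0
```

The figures agree with the stated 32.5-s blocks and 292.5-s runs.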

Each scanning run included two variations of a facial information discrimination task (each task variation presented in either the first or the last two of the four experimental blocks). For these two task

Fig. 1. Schematic diagram of the task performed by the subjects in the study. A. Schematic diagram of a run showing the temporal sequence and duration of its events. Each run contains five 6-trial blocks of a control condition (pattern matching) alternating with four 6-trial blocks of two variations of a facial information discrimination task (emotion matching and identity matching). Each task variation was presented in either the first or the last two of the four experimental blocks, and the order of experimental blocks within runs was counterbalanced across subjects of each group. B. For emotion matching (E) and identity matching (ID), three faces on a white background were presented in a triangular arrangement, a cue on the top center and two response choices side-by-side at the bottom of the screen. For the pattern matching condition (P), three composite images (cue and two response choices) appeared in a triangular arrangement. On each trial, subjects chose which response choice matched the cue according to the task instruction presented at the beginning of the block.


variations, black and white face pictures of five emotional expressions (i.e., happy, angry, fearful, surprised, and neutral) from the Ekman series (Ekman, 2004) were used as cue and response choice stimuli. The first task variation, "matching affect", required subjects to choose the response choice face whose emotion matched the emotional expression of the cue face regardless of the depicted face's identity. The second variation, "matching identity", asked subjects to select the response choice face whose identity matched the identity of the cue face regardless of emotional expression. For the control task, "matching patterns", the stimuli were composite images of overlapping circles or squares whose luminance and contrast were similar to those of the face stimuli. Subjects were asked to choose which response choice composite image had a pattern similar to that of the cue image. The control images could be discriminated, and matched, by the distribution pattern of their circles or squares or by the type of elements (circles or squares) composing them (see Fig. 1).

2.3. Imaging data acquisition

Functional MRI images were acquired on a 3 Tesla Siemens Allegra scanner (Erlangen, Germany) located at the UCLA Ahmanson-Lovelace Brain Mapping Center. All subjects lay on the scanner bed in the supine position, wearing acoustic noise protection and a set of magnet-compatible LCD goggles. A high-resolution 3D MPRAGE data set was collected from each individual to facilitate later coordinate alignments (TR/TE/TI/Matrix Size/Flip Angle/FOV/Slice Thickness = 2300/2.9/1100/160×192×192/20/256×256/1 mm). Prior to functional image collection and for anatomical reference, we obtained an Echo Planar axial T2-weighted series from each subject (TR/TE/Matrix Size/Flip Angle/FOV/Slice Thickness = 5000/33/128×128×30/90/200×200/2.8 mm), covering 30 horizontal slices parallel to the AC–PC plane and with bandwidth and plane of section matched to the BOLD data sets later collected (with which they shared the same metric distortions). For the functional data sets themselves, we measured Blood Oxygenation Level Dependent (BOLD)-based signals using a T2*-weighted gradient-echo sequence (TR/TE/Matrix Size/Flip Angle/FOV/Slice Thickness = 2500/45/64×64×30/90/200×200/2.8 mm), acquiring images of slices parallel to the AC–PC plane that covered the entire volume of the brain.

2.4. Imaging data analysis

Imaging data were analyzed using the FMRIB Software Library (FSL) (Smith et al., 2004). The pre-statistics image processing included motion correction (Jenkinson et al., 2002), non-brain removal (Smith, 2002), spatial smoothing using a Gaussian kernel of FWHM 8 mm, and high-pass temporal filtering (Gaussian-weighted least-squares straight-line fitting with sigma = 50.0 s). After pre-processing, functional images were normalized into standard (Montreal Neurological Institute) space using affine transformation with FLIRT (Jenkinson and Smith, 2001).
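The smoothing and filtering parameters above are given as a FWHM in millimeters and a sigma in seconds, whereas FSL command-line tools typically expect a Gaussian sigma in millimeters (fslmaths -s) and a high-pass sigma in volumes (fslmaths -bptf). A short Python sketch of the conversions, assuming the TR of 2.5 s from Section 2.3:

```python
import math

FWHM_MM = 8.0      # spatial smoothing kernel (FWHM) from the text
HP_SIGMA_S = 50.0  # high-pass filter sigma from the text
TR_S = 2.5         # repetition time (Section 2.3)

# For a Gaussian, FWHM = 2 * sqrt(2 * ln 2) * sigma (approx. 2.355 * sigma).
sigma_mm = FWHM_MM / (2.0 * math.sqrt(2.0 * math.log(2.0)))

# FSL's -bptf option takes the filter sigma in volumes rather than seconds.
hp_sigma_vols = HP_SIGMA_S / TR_S

print(round(sigma_mm, 2), hp_sigma_vols)  # 3.4 20.0
```

These conversions are standard Gaussian-kernel arithmetic; the specific FSL invocation used by the authors is not stated in the text.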

Functional image data were analyzed using FEAT (FMRI Expert Analysis Tool) version 5.92, part of FSL, in successive steps. First, for each run and each subject, data from the emotion and identity matching conditions were modeled by convolving them with a canonical hemodynamic response function. Temporal derivatives and six motion parameters were included as covariates of no interest to increase

Table 2
Brain regions activated during different aspects of facial information processing.

Hemisphere  Region  Peak activation of ROI (x, y, z, z value)

Emotion matching
Controls:
  Left   Fusiform gyrus            −45  −54  −22   5.45
  Left   Middle temporal gyrus     −52  −52    2   5.57
  Left   Amygdala                  −20   −8  −20   4.59
  Right  Precentral gyrus           42    4   38   4.10
  Right  Middle frontal gyrus       40   14   28   4.75
  Left   Inferior frontal gyrus    −50   28   14   4.77
  Right  Inferior frontal gyrus     52   30    8   4.02
  Right  Frontal orbital cortex     44   26  −12   4.06
  Left   Frontal orbital cortex    −38   34  −18   3.67
Patients:
  Left   Lateral occipital cortex  −60  −62   −8   4.58
  Left   Fusiform gyrus            −42  −52  −26   4.90
  Left   Angular gyrus             −50  −58   14   4.11
  Left   Middle frontal gyrus      −40    6   34   3.37
  Left   Inferior frontal gyrus    −52   16   28   3.58
  Left   Inferior frontal gyrus    −52   30   −4   2.60
  Left   Inferior frontal gyrus    −52   32   16   4.40

Emotion vs identity
Controls:
  Right  Lateral occipital cortex   46  −70    2   3.46
  Left   Lateral occipital cortex  −54  −68   −2   3.24
  Right  Fusiform gyrus             42  −66  −16   4.07
  Right  Fusiform gyrus             42  −54  −20   3.71
  Left   Fusiform gyrus            −42  −48  −24   3.54
  Left   Superior temporal gyrus   −52  −40    2   3.24
  Right  Middle temporal gyrus      52  −42   −2   3.52
  Left   Middle temporal gyrus     −48  −26   −8   3.34
  Left   Inferior temporal gyrus   −48  −58  −20   3.77
  Right  Inferior temporal gyrus    52  −56  −12   3.05
  Right  Inferior frontal gyrus     44   10   20   4.44
  Right  Middle frontal gyrus       52   28   24   3.47
  Right  Frontal orbital cortex     32   26   −4   3.42
Patients:
  Right  Precentral gyrus           50   10   26   3.17
  Left   Inferior frontal gyrus    −52   16   26   3.55
  Right  Inferior frontal gyrus     52   16   14   3.76
  Right  Insular cortex             36   20   −6   3.20
  Left   Frontal operculum cortex  −36   22   10   3.71
  Right  Frontal operculum cortex   36   24   12   3.07
  Right  Middle frontal gyrus       38   32   18   2.87
  Left   Middle frontal gyrus      −40   10   32   3.44
  Left   Frontal pole              −46   36   16   3.50

Emotion and identity
Controls:
  Left   Lateral occipital cortex  −54  −68   10   4.07
  Left   Fusiform gyrus            −42  −50  −20   5.36
  Left   Middle temporal gyrus     −50  −58   10   5.52
  Left   Middle temporal gyrus     −56  −56    2   4.86
  Left   Hippocampus               −30  −12  −20   4.52
  Right  Hippocampus                28  −10  −20   4.88
  Left   Amygdala                  −20   −8  −20   4.90
  Right  Amygdala                   22   −6  −20   5.18
  Left   Precentral gyrus          −34    2   36   3.59
  Right  Precentral gyrus           40    4   34   4.00
  Left   Middle frontal gyrus      −40    6   36   3.78
  Right  Middle frontal gyrus       40   14   30   4.14
  Left   Inferior frontal gyrus    −46   18   20   5.17
  Right  Inferior frontal gyrus     46   20   24   4.12
  Right  Frontal orbital cortex     46   28   −4   4.53
  Left   Frontal orbital cortex    −36   34  −14   3.33
  Right  Frontal pole               44   38   −6   4.06
  Right  Frontal pole               48   46   −6   3.69
Patients:
  No significant activations detected


statistical sensitivity. Data from the pattern matching (i.e., control) condition was not explicitly modeled and therefore was treated as an implicit baseline condition. We computed four contrast images for each run and subject: emotion matching, identity matching, emotion matching versus identity matching, and emotion plus identity matching. Second, to average across the three runs each subject performed, we completed a second-level analysis using a fixed-effects model, by forcing the random effect variance to zero in FLAME (FMRIB's Local Analysis of Mixed Effects) (Beckmann et al., 2003; Woolrich et al., 2004). Third, we performed a mixed-effects model (FLAME stage 1 only) (Beckmann et al., 2003; Woolrich et al., 2004) to characterize activation patterns for each contrast of interest in each group separately and to directly compare activations of patients to those of controls for each contrast of interest. Unless otherwise noted, we used a threshold z value of 2.3 and a cluster probability of p<0.05, corrected for whole-brain multiple comparisons using Gaussian random field theory.
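As an illustration of the first-level modeling step described above, the sketch below builds a block regressor and convolves it with a canonical HRF. The double-gamma shape parameters and the block onsets are assumptions for illustration; the paper states only that a canonical hemodynamic response function was used:

```python
import numpy as np
from scipy.stats import gamma

TR = 2.5      # s (Section 2.3)
N_VOLS = 117  # a 292.5-s run sampled at TR = 2.5 s

def double_gamma_hrf(t):
    # SPM-style double-gamma shape; these parameters are an assumption,
    # since the text only says "canonical hemodynamic response function".
    peak = gamma.pdf(t, 6)         # positive response peaking around 5 s
    undershoot = gamma.pdf(t, 16)  # late negative undershoot
    h = peak - undershoot / 6.0
    return h / h.sum()             # normalize to unit sum

t = np.arange(0, 32, TR)
hrf = double_gamma_hrf(t)

# Boxcar for one condition's 32.5-s blocks; onsets here are hypothetical.
boxcar = np.zeros(N_VOLS)
for onset_s in (32.5, 162.5):
    start = int(onset_s / TR)
    boxcar[start:start + int(32.5 / TR)] = 1.0

# Predicted BOLD time course for the condition, as entered into the GLM.
regressor = np.convolve(boxcar, hrf)[:N_VOLS]
```

In FEAT this convolution, along with temporal derivatives and motion covariates, is handled internally when the design matrix is specified.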

3. Results

3.1. Behavioral performance

The behavioral performance results from both groups are shown in Table 1. We performed a three (condition) by two (group) repeated measures ANOVA with condition as a within-subject factor and group as a between-subject factor. The main effect of condition was significant (F(2,36)=7.637, p<0.01), but neither the main effect of group nor the condition by group interaction was significant. Post-hoc analyses showed that both schizophrenia patients and healthy controls performed worse on the emotion matching condition than on the identity matching (p<0.01) or pattern matching (p<0.05) conditions, whereas identity and pattern matching did not differ (p=0.20).
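The pairwise condition comparisons can be sketched with paired t-tests in Python. The per-subject accuracies below are synthetic values loosely shaped like Table 1, for illustration only; the paper does not state which post-hoc procedure was used:

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n = 15  # e.g., the healthy control group size

# Synthetic per-subject accuracies, loosely matching the means and SDs
# reported in Table 1 (illustrative only, not the study's data).
emotion = np.clip(rng.normal(0.91, 0.07, n), 0.0, 1.0)
identity = np.clip(rng.normal(0.96, 0.12, n), 0.0, 1.0)
pattern = np.clip(rng.normal(0.95, 0.04, n), 0.0, 1.0)

# Pairwise comparisons between the three task conditions.
for name, a, b in [("emotion vs identity", emotion, identity),
                   ("emotion vs pattern", emotion, pattern),
                   ("identity vs pattern", identity, pattern)]:
    t, p = ttest_rel(a, b)  # paired t-test across subjects
    print(f"{name}: t = {t:.2f}, p = {p:.3f}")
```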

3.2. Brain activations

Table 2 lists brain areas (local maxima of the clusters) that exhibited above-threshold activations for each contrast of interest in the healthy control and schizophrenia patient groups. Overall, although the two subject groups showed distinctly different patterns of activation during different aspects of the SFIM task, direct statistical comparisons between the two groups did not yield statistically significant above-threshold differences.

3.2.1. Identity matching

After contrasting with the complex pattern discrimination (i.e., control) task, no statistically significant above-threshold brain activation was found in either group during the Identity Matching condition.

3.2.2. Emotion matching

During the Emotion Matching condition, healthy controls showed increased activations (relative to complex pattern discrimination) in brain regions including the left temporo-occipital junction, left fusiform gyrus, left amygdala, right middle frontal gyrus, bilateral inferior frontal

Table 1
Behavioral performance (accuracy) of schizophrenia patients and healthy controls.

                    Schizophrenia patients   Healthy controls
Emotion matching    0.86 (0.08)              0.91 (0.07)
Identity matching   0.99 (0.02)              0.96 (0.12)
Pattern matching    0.93 (0.04)              0.95 (0.04)

Values represent mean accuracy (standard deviation).

gyri and the orbital frontal cortex. Schizophrenia patients showed a different distributed pattern of activation, which included the left inferior frontal gyrus, left middle frontal gyrus, left lateral fusiform gyrus, left occipital cortex and left angular gyrus. No activation related to facial emotion matching was observed in the amygdala in schizophrenia patients.


3.2.3. Emotion versus identity matching

When the Emotion Matching condition was directly compared to the Identity Matching condition, statistically significant activations were seen in healthy controls in the bilateral temporo-occipital junction and anterior fusiform gyri, bilateral middle temporal gyri, left superior temporal gyrus, right inferior and middle frontal gyri, and right orbital frontal cortex (see Table 2 and Fig. 2). In schizophrenia patients, on the other hand, statistically significant activations were observed in a far smaller number of brain areas, including the bilateral middle and inferior frontal gyri, bilateral frontal opercula, and right insular cortex. No contrast activation related to selective processing of emotion versus identity facial information was observed in the fusiform gyrus or the temporo-occipital junction in schizophrenia patients.

3.2.4. Emotion matching and identity matching

When the BOLD signal contrasts between Emotion Matching and Pattern Matching and between Identity Matching and Pattern Matching were analyzed together, normal controls showed statistically significant relative brain activations extending to the left lateral occipital gyrus, left middle temporal gyrus, left anterior portion of the fusiform gyrus, bilateral hippocampi, bilateral amygdalae, and bilateral middle and inferior frontal gyri. In sharp contrast, no statistically significant relative activation associated with contrast analyses between facial information processing as a whole and complex pattern processing was observed in schizophrenia patients.

4. Discussion

In this study, we investigated neural activation patterns associated with facial information processing in schizophrenia patients and healthy controls using fMRI. Our design was intended to clarify whether deficits in facial affect processing in schizophrenia patients result from impaired decoding of the emotional content of facial expressions or are compounded by impairments in processing basic, non-emotional facial information or even complex visual stimuli at large. We found

Fig. 2. Brain activations during different aspects of facial information processing. Upper images: sections of brain templates with overlaid group analysis results of significant increases in signal intensity during emotion matching relative to pattern matching conditions in (A) controls, and (B) schizophrenia patients. Lower images: sections of brain templates with overlaid group analysis results of significant increases in signal intensity during emotion matching versus identity matching conditions in (C) controls and (D) schizophrenia patients. MNI coordinates including hemisphere distribution, voxel statistical significance and cluster extensions are described in Table 2.

that, when neural activation patterns related to the decoding of emotional content of facial expressions were contrasted with those related to the processing of basic facial features, healthy controls and schizophrenia patients exhibited remarkably distinct relative activation patterns. Whereas controls as a group activated a distributed neural network that included extended areas of the fusiform gyrus, the temporal cortex and the right prefrontal and orbital frontal cortex, schizophrenia patients activated bilateral areas of the prefrontal cortex but not the fusiform gyrus or the temporal cortex. No such difference between the two subject groups was found when contrasting neural activations related to processing non-emotional facial features versus those related to processing complex visual stimuli.

We previously showed that schizophrenia patients fail to activate the lateral fusiform gyrus when processing facial information (Quintana et al., 2003b), but could not determine whether that deficit was related to processing of emotional or non-emotional features of faces. Our findings here suggest that dysfunctions in extended components of the visual system, including the fusiform gyri and related, neighboring association cortex areas, may be selectively observed in schizophrenia patients when processing emotional components of facial information but not when processing non-emotional facial features. Our observation is consistent with recent findings of comparable activations of the fusiform gyrus in schizophrenia patients and healthy controls when processing non-emotional features of faces (Yoon et al., 2006).

We also found that, when contrasting neural activity related to processing the emotional content of faces with that related to processing complex visual stimuli, healthy controls showed increased relative activations in a distributed network including the left fusiform gyrus, the left middle temporal gyrus, and the left amygdala, regions that have traditionally been linked to affective face processing (Adolphs et al., 1999; Critchley et al., 2000; Haxby et al., 2000). The same contrast analysis in schizophrenia patients did not detect increased relative amygdalar activations, in agreement with findings of previous studies (Gur et al., 2002; Williams et al., 2004; Schneider et al., 2006; Gur et al., 2007). In our previous study on facial information processing in schizophrenia (Quintana et al., 2003b), we did not
find differential activations in the amygdala between subject groups, or even amygdalar activations relative to the control task in healthy controls. However, subjects in that study were asked to discriminate emotional information by matching either pictures of faces or labels of affective states to a target face depicting affect (Quintana et al., 2003b). It has been suggested that activations of the amygdala during facial affect processing can be modulated by how such information is presented or processed (Hariri et al., 2000). For example, stronger activation of the amygdala is observed during implicit emotion processing (e.g., matching emotion) than during explicit emotion processing (e.g., affect labeling) (Hariri et al., 2000). Hence, it is possible that amygdalar activation differences between healthy controls and schizophrenia patients could have been masked by the use of affect labels in our previous study, whereas such differences were revealed here when using only face pictures during the Emotion Matching condition.

When contrasting neural activations related to processing of basic facial features with those related to processing complex visual stimuli, neither subject group showed statistically significant activations. This finding may appear surprising considering the previously described robust activation of the fusiform gyrus during facial information processing (Kanwisher and Yovel, 2006). In our previous study (Quintana et al., 2003b) we did indeed observe increased fusiform gyrus activations in healthy controls when processing faces, regardless of the specific task demand. In that study, however, the task used to control for general visual processing involved matching simple rounded shapes (i.e., circles and ellipses). To control for general visual processing in the current study, instead of those simple visual stimuli, we asked participants to match the general pattern of complex visual stimuli that would impose processing demands similar to those of the basic features of the human face yet did not involve facial features. Discriminating complex visual stimuli may be accomplished by processing their overall patterns (i.e., holistic or configural processing) instead of comparing individual components. Hence, both processing basic facial features (i.e., Identity) and processing complex visual stimuli (i.e., Pattern) may have used a similar configural processing approach and thus similarly activated the fusiform gyri, known to be linked to such holistic processing (Yovel and Kanwisher, 2005; Schiltz and Rossion, 2006). Future studies with a direct comparison of fusiform activations during processing of complex versus simple visual stimuli might be useful to further clarify these points.

Our findings indicate that schizophrenia patients rely mainly on frontal cortex resources to make judgments about the affective information of faces, whereas healthy controls utilize a distributed neural network additionally involving the fusiform gyri and amygdala for such a mental task. The prefrontal cortex is involved in increasing cognitive effort and attentional control (MacDonald et al., 2000). Increased prefrontal cortex activation during emotion processing tasks can be observed especially when such tasks require emotional appraisal or attending to emotional stimuli in general (Drevets and Raichle, 1998; Ochsner et al., 2002; Cunningham et al., 2004). Older adults show increased prefrontal cortex activation compared with young adults when viewing emotional faces (Gunning-Dixon et al., 2003). It is possible that schizophrenia patients in our study, despite comparable behavioral performance, found the emotional processing component of our task more demanding than our healthy controls did. Whether a result of illness-related subjective difficulties or of previously described neural abnormalities (Quintana et al., 2003a,b), this would have led them to use a different cognitive strategy when processing emotional information. Patients might have processed emotional information using other, possibly compensatory, mechanisms instead of relying on habitual structures such as the amygdala. Compensatory mechanisms have been described before, mostly in the prefrontal and posterior parietal cortices as well as in the mirror premotor and motor systems, in schizophrenia patients performing working memory tasks involving affective facial information (Quintana et al., 2001,

2003a). The absence of statistically significant differences when directly comparing the two subject groups, however, precludes our data from being conclusive in this regard.

To summarize, we found distinctly different patterns of neural activation during facial information processing in schizophrenia patients and healthy controls. The differences appear mostly related to the emotional decoding component of the task and include a lack of activations in the amygdala and the fusiform gyrus and increased activations in the prefrontal cortex in schizophrenia patients. Hence, we may conclude that neural dysfunctions associated with impaired facial affective processing in schizophrenia occur in relation to decoding facial emotional cues but not in relation to encoding non-emotional facial features or processing complex visual stimuli in general. This conclusion should, however, be tempered by some limitations of our study. First, the number of subjects in each group, the visually complex nature of our control task (otherwise necessary to rule out confounding overall visual processing deficits in schizophrenia), and the use of multiple conditions may have resulted in a lack of statistically significant above-threshold differences when relative activations in the two groups were compared directly. Second, because we used a block design for its inherent output advantages, it was not possible to directly examine instance-to-instance correlations between behavioral performance and neural activations. Nonetheless, our results further confirm that schizophrenia patients may use different neural circuits than healthy controls when processing the emotional expression of faces.
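The block-design limitation noted above can be made concrete: in a block design, the predicted BOLD signal is a boxcar of task blocks convolved with a hemodynamic response function (HRF), so all trials within a block share a single regressor and cannot be modeled individually. The following is an illustrative sketch using a simple double-gamma HRF with textbook-style parameters, not the study's actual timings or model.

```python
import math
import numpy as np

# Illustrative block-design regressor: a boxcar convolved with a simple
# double-gamma HRF. Block lengths, TR, and HRF parameters are generic
# illustration values, not taken from the study.

def hrf(t):
    """Simple double-gamma hemodynamic response (peak near 5 s,
    small undershoot near 15 s)."""
    return (t**5 * np.exp(-t) / math.factorial(5)
            - t**15 * np.exp(-t) / (6.0 * math.factorial(15)))

tr = 2.0                       # repetition time in seconds
n_scans = 100
times = np.arange(n_scans) * tr

# Boxcar: 20-s task blocks alternating with 20-s rest blocks.
boxcar = (((times // 20) % 2) == 0).astype(float)

# Sample the HRF at the TR and convolve with the boxcar.
hrf_samples = hrf(np.arange(0.0, 30.0, tr))
regressor = np.convolve(boxcar, hrf_samples)[:n_scans]

# Every scan in a block is driven by the same regressor, so trial-by-
# trial (instance-to-instance) amplitudes within a block cannot be
# estimated separately -- the limitation noted in the Discussion.
print(regressor.max())
```

An event-related design would instead convolve the HRF with individual trial onsets, yielding one regressor (or amplitude estimate) per trial at the cost of detection power, which is why block designs remain attractive when only condition-level contrasts are needed.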

Acknowledgements

This research was partially supported by a Merit Review Award from the Veterans Administration Medical Research Service to J. Quintana. We would like to thank Jonathan Osowsky and Nicolle Ma for their helpful assistance.

We also thank the Brain Mapping Medical Research Organization, Brain Mapping Support Foundation, Pierson-Lovelace Foundation, The Ahmanson Foundation, William M. and Linda R. Dietel Philanthropic Fund at the Northern Piedmont Community Foundation, Tamkin Foundation, Jennifer Jones-Simon Foundation, Capital Group Companies Charitable Foundation, Robson Family, and Northstar Fund for their generous support of the UCLA Brain Mapping Center.

References

Addington, J., Addington, D., 1998. Facial affect recognition and information processing in schizophrenia and bipolar disorder. Schizophrenia Research 32 (3), 171–181.

Adolphs, R., Tranel, D., Hamann, S., Young, A.W., Calder, A.J., Phelps, E.A., Anderson, A., Lee, G.P., Damasio, A.R., 1999. Recognition of facial emotion in nine individuals with bilateral amygdala damage. Neuropsychologia 37 (10), 1111–1117.

Beckmann, C.F., Jenkinson, M., Smith, S.M., 2003. General multilevel linear modeling for group analysis in FMRI. Neuroimage 20 (2), 1052–1063.

Bentin, S., Allison, T., Puce, A., Perez, E., McCarthy, G., 1996. Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience 8, 551–565.

Brekke, J., Kay, D.D., Lee, K.S., Green, M.F., 2005. Biosocial pathways to functional outcome in schizophrenia. Schizophrenia Research 80 (2–3), 213–225.

Bruce, V., Young, A., 1986. Understanding face recognition. British Journal of Psychology 77, 305–327.

Butler, P.D., Schechter, I., Zemon, V., Schwartz, S.G., Greenstein, V.C., Gordon, J., Schroeder, C.E., Javitt, D.C., 2001. Dysfunction of early-stage visual processing in schizophrenia. The American Journal of Psychiatry 158 (7), 1126–1133.

Chen, Y., Norton, D., McBain, R., Ongur, D., Heckers, S., 2009. Visual and cognitive processing of face information in schizophrenia: detection, discrimination and working memory. Schizophrenia Research 107 (1), 92–98.

Critchley, H.D., Daly, E., Phillips, M., Brammer, M., Bullmore, E., Williams, S., Van Amelsvoort, T., Robertson, D., David, A., Murphy, D., 2000. Explicit and implicit neural mechanisms for processing social information from facial expressions: a functional magnetic resonance imaging study. Human Brain Mapping 9 (2), 93–105.

Cunningham, W.A., Raye, C.L., Johnson, M.K., 2004. Implicit and explicit evaluation: fMRI correlates of valence, emotional intensity, and control in the processing of attitudes. Journal of Cognitive Neuroscience 16, 1717–1729.

Drevets, W.C., Raichle, M.E., 1998. Reciprocal suppression of regional cerebral blood flow during emotional versus higher cognitive processes: implications for interaction between emotion and cognition. Cognition and Emotion 12, 353–385.


Edwards, J., Pattison, P.E., Jackson, H.J., Wales, R.J., 2001. Facial affect and affective prosody recognition in first-episode schizophrenia patients. Schizophrenia Research 48, 235–253.

Eimer, M., 2000. The face-specific N170 component reflects late stages in the structural encoding of faces. NeuroReport 11, 2319–2324.

Ekman, P., 2004. Subtle Expression Training Tool (SETT) & Micro Expression Training Tool (METT) (Version Paul Ekman). www.paulekman.com.

Fakra, E., Salgado-Pineda, P., Delaveau, P., Hariri, A.R., Blin, O., 2008. Neural bases of different cognitive strategies for facial affect processing in schizophrenia. Schizophrenia Research 100 (1–3), 191–205.

First, M.B., Spitzer, R.L., Gibbon, M., Williams, J.B.W., 1997. Structured Clinical Interview for DSM-IV Axis I Disorders—Patient Edition. New York State Psychiatric Institute, New York, NY.

Gunning-Dixon, F.M., Gur, R.C., Perkins, A.C., Schroeder, L., Turner, T., Turetsky, B.I., Chan, R.M., Loughead, J.W., Alsop, D.C., Maldjian, J., Gur, R.E., 2003. Age-related differences in brain activation during emotional face processing. Neurobiology of Aging 24 (2), 285–295.

Gur, R.E., McGrath, C., Chan, R.M., Schroeder, L., Turner, T., Turetsky, B.I., Kohler, C., Alsop, D., Maldjian, J., Ragland, J.D., Gur, R.C., 2002. An fMRI study of facial emotion processing in patients with schizophrenia. The American Journal of Psychiatry 159 (12), 1992–1999.

Gur, R.E., Loughead, J., Kohler, C.G., Elliott, M.A., Lesko, K., Ruparel, K., Wolf, D.H., Bilker, W.B., Gur, R.C., 2007. Limbic activation associated with misidentification of fearful faces and flat affect in schizophrenia. Archives of General Psychiatry 64 (12), 1356–1366.

Hariri, A.R., Bookheimer, S.Y., Mazziotta, J.C., 2000. Modulating emotional responses: effects of a neocortical network on the limbic system. NeuroReport 11 (1), 43–48.

Hariri, A.R., Mattay, V.S., Tessitore, A., Fera, F., Weinberger, D.R., 2003. Neocortical modulation of the amygdala response to fearful stimuli. Biological Psychiatry 53 (6), 494–501.

Haxby, J.V., Hoffman, E.A., Gobbini, M.I., 2000. The distributed human neural system for face perception. Trends in Cognitive Sciences 4, 223–233.

Ihnen, G.H., Penn, D.L., Corrigan, P.W., Martin, J., 1998. Social perception and social skills in schizophrenia. Psychiatry Research 80, 275–286.

Jenkinson, M., Smith, S., 2001. A global optimisation method for robust affine registration of brain images. Medical Image Analysis 5, 143–156.

Jenkinson, M., Bannister, P., Brady, M., Smith, S., 2002. Improved optimisation for the robust and accurate linear registration and motion correction of brain images. Neuroimage 17 (2), 825–841.

Kanwisher, N., Yovel, G., 2006. The fusiform face area: a cortical region specialized for the perception of faces. Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences 361 (1476), 2109–2128.

Kawasaki, H., Kaufman, O., Damasio, H., Damasio, A.R., Granner, M., Bakken, H., Hori, T., Howard 3rd, M.A., Adolphs, R., 2001. Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex. Nature Neuroscience 4 (1), 15–16.

Kee, K.S., Green, M.F., Mintz, J., Brekke, J.S., 2003. Is emotion processing a predictor of functional outcome in schizophrenia? Schizophrenia Bulletin 29 (3), 487–497.

Lee, S.H., Kim, E.Y., Kim, S., Bae, S.M., 2010. Event-related potential patterns and gender effects underlying facial affect processing in schizophrenia patients. Neuroscience Research 67 (2), 172–180.

Lynn, S.K., Salisbury, D.F., 2008. Attenuated modulation of the N170 ERP by facial expressions in schizophrenia. Clinical EEG and Neuroscience 39 (2), 108–111.

MacDonald 3rd, A.W., Cohen, J.D., Stenger, V.A., Carter, C.S., 2000. Dissociating the role of the dorsolateral prefrontal and anterior cingulate cortex in cognitive control. Science 288 (5472), 1835–1838.

Mandal, M.K., Pandey, R.P., Prasad, A.B., 1998. Facial expressions of emotions and schizophrenia: a review. Schizophrenia Bulletin 24, 399–412.

Ochsner, K.N., Bunge, S.A., Gross, J.J., Gabrieli, J.D., 2002. Rethinking feelings: an fMRI study of the cognitive regulation of emotion. Journal of Cognitive Neuroscience 14, 1215–1229.

Oldfield, R.C., 1971. The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia 9 (1), 97–113.

Onitsuka, T., Niznikiewicz, M.A., Spencer, K.M., Frumin, M., Kuroki, N., Lucia, L.C., Shenton, M.E., McCarley, R.W., 2006. Functional and structural deficits in brain regions subserving face perception in schizophrenia. The American Journal of Psychiatry 163 (3), 455–462.

Quintana, J., Davidson, T., Kovalik, E., Marder, S.R., Mazziotta, J.C., 2001. A compensatory mirror cortical mechanism for facial affect processing in schizophrenia. Neuropsychopharmacology 25, 915–924.

Quintana, J., Wong, T., Ortiz-Portillo, E., Kovalik, E., Davidson, T., Marder, S.R., Mazziotta, J.C., 2003a. Prefrontal–posterior parietal networks in schizophrenia: primary dysfunctions and secondary compensations. Biological Psychiatry 53 (1), 12–24.

Quintana, J., Wong, T., Ortiz-Portillo, E., Marder, S.R., Mazziotta, J.C., 2003b. Right lateral fusiform gyrus dysfunction during facial information processing in schizophrenia. Biological Psychiatry 53, 1099–1112.

Schiltz, C., Rossion, B., 2006. Faces are represented holistically in the human occipito-temporal cortex. Neuroimage 32, 1385–1394.

Schneider, F., Gur, R.C., Koch, K., Backes, V., Amunts, K., Shah, N.J., Bilker, W., Gur, R.E., Habel, U., 2006. Impairment in the specificity of emotion processing in schizophrenia. The American Journal of Psychiatry 163 (3), 442–447.

Smith, S.M., 2002. Fast robust automated brain extraction. Human Brain Mapping 17, 143–155.

Smith, S.M., Jenkinson, M., Woolrich, M.W., Beckmann, C.F., Behrens, T.E., Johansen-Berg, H., Bannister, P.R., De Luca, M., Drobnjak, I., Flitney, D.E., Niazy, R.K., Saunders, J., Vickers, J., Zhang, Y., De Stefano, N., Brady, J.M., Matthews, P.M., 2004. Advances in functional and structural MR image analysis and implementation as FSL. Neuroimage 23, S208–S219.

Streit, M., Ioannides, A.A., Liu, L., Wölwer, W., Dammers, J., Gross, J., Gaebel, W., Müller-Gärtner, H.W., 1999. Neurophysiological correlates of the recognition of facial expressions of emotion as revealed by magnetoencephalography. Brain Research: Cognitive Brain Research 7, 481–491.

Streit, M., Ioannides, A., Sinnemann, T., Wolwer, W., Dammers, J., Zilles, K., Gaebel, W., 2001. Disturbed facial affect recognition in patients with schizophrenia associated with hypoactivity in distributed brain regions: a magnetoencephalographic study. The American Journal of Psychiatry 158 (9), 1429–1436.

Turetsky, B.I., Kohler, C.G., Indersmitten, T., Bhati, M.T., Charbonnier, D., Gur, R.C., 2007. Facial emotion recognition in schizophrenia: when and why does it go awry? Schizophrenia Research 94 (1–3), 253–263.

Williams, L.M., Das, P., Harris, A.W., Liddell, B.B., Brammer, M.J., Olivieri, G., Skerrett, D., Phillips, M.L., David, A.S., Peduto, A., Gordon, E., 2004. Dysregulation of arousal and amygdala-prefrontal systems in paranoid schizophrenia. The American Journal of Psychiatry 161 (3), 480–489.

Woolrich, M.W., Behrens, T.E., Beckmann, C.F., Jenkinson, M., Smith, S.M., 2004. Multi-level linear modeling for FMRI group analysis using Bayesian inference. Neuroimage 21 (4), 1732–1747.

Wynn, J.K., Lee, J., Horan, W.P., Green, M.F., 2008. Using event-related potentials to explore stages of facial affect recognition deficits in schizophrenia. Schizophrenia Bulletin 34 (4), 679–687.

Yoon, J.H., D'Esposito, M., Carter, C.S., 2006. Preserved function of the fusiform face area in schizophrenia as revealed by fMRI. Psychiatry Research: Neuroimaging 148 (2–3), 205–216.

Yovel, G., Kanwisher, N., 2005. The neural basis of the behavioral face-inversion effect.Current Biology 15, 2256–2262.