
Sounds and scents in (social) action

Salvatore M. Aglioti 1,2 and Mariella Pazzaglia 1,2

1 Dipartimento di Psicologia, Sapienza University of Rome, Via dei Marsi 78, Rome I-00185, Italy
2 IRCCS Fondazione Santa Lucia, Via Ardeatina 306, Rome I-00179, Italy

Corresponding author: Aglioti, S.M. ([email protected]).

Although vision seems to predominate in triggering the simulation of the behaviour and mental states of others, the social perception of actions might rely on auditory and olfactory information not only when vision is lacking (e.g. in congenitally blind individuals), but also in daily life (e.g. hearing footsteps along a dark street prompts an appropriate fight-or-flight reaction and smelling the scent of coffee prompts the act of grasping a mug). Here, we review recent evidence showing that non-visual, telereceptor-mediated motor mapping might occur as an autonomous process, as well as within the context of the multimodal perceptions and representations that characterize real-world experiences. Moreover, we discuss the role of auditory and olfactory resonance in anticipating the actions of others and, therefore, in shaping social interactions.

Telereceptive senses, namely vision, audition and olfaction

Perceiving and interacting with the world and with other individuals might appear to be guided largely by vision, which, according to classical views, leads over audition, olfaction and touch, and commands, at least in human and non-human primates, most types of cross-modal and perceptuo-motor interactions [1]. However, in sundry daily life circumstances, our experience with the world is inherently cross-modal [2]. For example, inputs from all sensory channels combine to increase the efficiency of our actions and reactions. Seeing flames, smelling smoke or hearing a fire alarm might each be sufficient to create an awareness of a fire. However, the combination of all these signals ensures that our response to danger is more effective. The multimodal processing of visual, acoustic and olfactory information is even more important for our social perception of the actions of other individuals [3]. Indeed, vision, audition and olfaction are the telereceptive senses that process information coming from both the near and the distant external environment, on which the brain then defines the self–other border and the surrounding social world [4,5].

Behavioural studies suggest that action observation and execution are coded according to a common representational medium [6]. Moreover, neural studies indicate that seeing actions activates a fronto-parietal neural network that is also active when performing those same actions [7,8]. Thus, the notion that one understands the actions of others by simulating them motorically is based mainly on visual studies (Box 1). Vision is also the channel used for studying the social nature of somatic experiences (e.g. touch and pain) [9–11] and emotions (e.g. anger, disgust and happiness) [12]. In spite of the notion that seeing might be informed by what one hears or smells, less is known about the possible mapping of actions through the sound and the odour associated with them, either in the absence of vision or within the context of clear cross-modal perception. In this review, we question the exclusive supremacy of vision in action mapping, not to promote a democracy of the senses, but to highlight the crucial role of the other two telereceptive channels in modulating our actions and our understanding of the world in general, and of the social world in particular.

The sound and flavour of actions

Classic cross-modal illusions, such as ventriloquism or the McGurk effect, indicate that vision is a key sense in several circumstances [13,14]. Therefore, when multisensory cues are simultaneously available, humans display a robust tendency to rely more on visual than on other forms of sensory information, particularly when dealing with spatial tasks (a phenomenon referred to as the ‘Colavita visual dominance effect’) [15]. However, our knowledge is sometimes dominated by sound and is filtered through a predominantly auditory context. Auditory stimuli might, for example, capture visual stimuli in temporal localization tasks [16]. Moreover, the presentation of two beeps and a single flash induces the perception of two visual stimuli [17]. Thus, sound-induced flash illusions create the mistaken belief that we are seeing what we are, in fact, only hearing.

This pattern of results might be in keeping with the notion that multisensory processing reflects ‘modality appropriateness’ rules, whereby vision dominates in spatial tasks and audition in temporal ones [18]. However, psychophysical studies indicate that the degradation of visual inputs enables auditory inputs to modulate spatial localization [19]. This result is in keeping with the principle of inverse effectiveness [20], according to which multisensory integration is more probable or stronger for unisensory stimuli that evoke relatively weak responses when presented in isolation. Notably, the recording of neural activity from the auditory cortex of alert monkeys watching naturalistic audiovisual stimuli indicates not only that congruent bimodal events provide more information than do unimodal ones, but also that suppressed responses are less variable and, thus, more informative than enhanced responses [21].
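The ‘modality appropriateness’ and inverse-effectiveness findings above are often framed in terms of reliability-weighted (maximum-likelihood) cue combination, in which each sense is weighted by its precision. The sketch below is purely illustrative and is not taken from the studies cited here: the location estimates and variances are hypothetical, but it shows how degrading the visual input shifts spatial localization toward the auditory estimate.

```python
def combine_cues(x_v, var_v, x_a, var_a):
    """Reliability-weighted combination of a visual and an auditory location
    estimate: each cue is weighted by its inverse variance, and the combined
    variance is never larger than that of the better single cue."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_a)
    w_a = 1 - w_v
    combined = w_v * x_v + w_a * x_a
    combined_var = 1 / (1 / var_v + 1 / var_a)
    return combined, combined_var

# Hypothetical estimates (degrees): vision says 0, audition says +4.
# Sharp vision keeps the combined estimate close to the visual location...
print(combine_cues(x_v=0.0, var_v=1.0, x_a=4.0, var_a=16.0))   # ~(0.24, 0.94)
# ...while a degraded (high-variance) visual input hands the weight to audition.
print(combine_cues(x_v=0.0, var_v=25.0, x_a=4.0, var_a=16.0))  # ~(2.44, 9.76)
```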

Relevant to the present review is that action sounds might be crucial for signalling socially dangerous or unpleasant events. Efficient mechanisms for matching audition with action might be important, even at basic levels, because they might ensure the survival of all hearing individuals.


Box 1. Beyond the visuomotor mirror system

Mirror neurons (MNs), originally discovered in the monkey ventral premotor cortex (F5) and inferior parietal lobe (PFG), increase their activity during action execution as well as during viewing of the same action [68,69]. Single-cell recording from the ventral premotor cortex showed that MNs also fired when sight of the hand–object interaction was temporarily occluded [70]. In a similar way, the activity in the parietal MNs of the onlooking monkey was modulated differentially when the model exhibited different intentions (e.g. grasping the same object to eat or to place it) [71]. Taken together, these results suggest that MNs represent the observed actions according to anticipatory codes. Relevant to the present review is the existence of audio-motor MNs specifically activated when the monkey hears the sound of a motor act without seeing or feeling it [72]. In addition, the multimodal response of visuo-audio-motor neurons might be superadditive; that is, stronger than the sum of the unimodal responses [73]. Whereas audio MNs might underpin an independent and selective mapping modality [72], triple-duty neurons are likely to constitute the neural substrate of the complex multimodal mapping of actions [73]. Therefore, the physiological properties of these resonant neurons suggest they constitute a core mechanism for representing the actions of others.

In a recent study, single-cell recording was conducted in human patients who observed and performed emotional and non-emotional actions. The study provides direct evidence of double-duty visuo-motor neurons, possibly coding for resonant emotion and action [74]. Importantly, the human ‘see-do’ neurons were found in the medial frontal and temporal cortices (where the patients, for therapeutic reasons, had electrodes implanted). These two regions are not part of the classic mirror system, suggesting that the onlooker-model resonance extends beyond action mirroring and the premotor-parietal network. Direct information relating to the non-visual and anticipatory properties of the human mirror system is, however, still lacking.


For example, in the dark of the primordial nights, ancestral humans probably detected potential dangers (e.g. the footsteps of enemies) mainly by audition and, therefore, implemented effective fight-or-flight behaviour. However, action–sound-mediated inferences about others might also occur in several daily life circumstances in present times. Imagine, for example, your reaction to the approach of heavy footsteps when you are walking along a dark street. Furthermore, listening to the footsteps of known individuals might enable one not only to recognize their identity [22], but also to determine their disposition (e.g. a bad mood).

Although olfaction in some mammals mediates sophisticated social functions, such as mating, and might facilitate the recognition of ‘who is who’ [23], this sense is considered somewhat vestigial in humans. However, even in humans, olfaction is closely related not only to neurovegetative and emotional reactivity, but also to higher-order functions, such as memory. Moreover, olfaction in humans is also linked to empathic reactivity [24], kin recognition [25], cross-modal processing of the faces of others and the construction of the semantic representation of objects [26]. Behavioural studies indicate that the grasping of small (e.g. an almond) or large (e.g. an apple) objects with characteristic odours is influenced by the delivery of the same or of different smells. In particular, a clear interference with the kinematics of grasping [27] and reaching [28] movements was found in conditions of mismatch between the observed objects (e.g. a strawberry) and the odour delivered during the task (e.g. the scent of an orange).


Mapping sound- and odour-related actions in the human brain

Inspired by single-cell recordings in monkeys [7], many neuroimaging and neurophysiological studies suggest that the adult human brain is equipped with neural systems and mechanisms that represent visual perception and the execution of action in common formats. Moreover, studies indicate that a large network, centred on the inferior frontal gyrus (IFG) and the inferior parietal lobe (IPL), and referred to as the action observation network (AON) [29,30], underpins action viewing and action execution. Less information is available about whether the AON [31] is also activated by the auditory and olfactory coding of actions.

The phenomena, mechanisms and neural structures involved in processing action-related sounds have been explored in healthy subjects (Figure 1) and in brain-damaged individuals (Box 2) using correlational [32–36] and causative approaches [37]. At least two important conclusions can be drawn from these studies. The first is that listening to the sound produced by human body parts (e.g. two hands clapping) activates the fronto-parietal AON. The second is that such activation might be somatotopically organized, with the left dorsal premotor cortex and the IPL being more responsive to the execution and hearing of hand movements than to mouth actions or to sounds that are not associated with human actions (e.g. environmental sounds, a phase-scrambled version of the same sound, or a silent event). Conversely, the more ventral regions of the left premotor cortex are more involved in processing sounds produced by mouth actions (Figure 1 and Box 2).

The social importance of olfaction in humans has been demonstrated in a positron emission tomography (PET) study [38], showing that body odours activate a set of cortical regions that differs from those activated by non-body odours. In addition, smelling the body odour of a friend activates different neural regions (e.g. the extrastriate body area, EBA) from smelling the odour of strangers (e.g. the amygdala and insula). However, interest in the olfactory coding of actions and its neural underpinnings is very recent, and only two correlational studies have addressed this topic thus far (Figure 2). In particular, merely smelling food objects induced both a specific facilitation of the corticospinal system [39] and specific neural activity in the AON [40].

Multimodal coding of actions evoked by auditory and olfactory cues

The inherently cross-modal nature of action perception is supported by evidence showing that a combination of multiple sensory channels might enable individuals to interpret actions better. The merging of visual and auditory information, for example, enables individuals to optimize their perceptual and motor behaviour [41]. Moreover, a combination of olfactory and visual inputs facilitates the selection of goal-directed movements [42]. Importantly, although auditory or olfactory cues might each increase neural activity in action-related brain regions, such effects might be higher when the two modalities are combined.


Figure 1. The sound of actions. Representative studies on the auditory mapping of actions, performed by using different state-of-the-art cognitive neuroscience techniques. (a) Left panel: cortical activity evoked by listening to sounds associated with human finger (red line) and tongue (blue line) movements that were used as deviant stimuli to evoke a potential known as mismatch negativity (MMN). Sounds not associated with any actions were used as control stimuli. Deviant stimuli produced larger MMNs than did sounds not already associated with actions 100 ms after the stimulus presentation. Furthermore, the source estimation of MMN indicates that finger and tongue sounds activated distinct regions in the left pre-motor areas, suggesting an early, automatic, somatotopic mapping of action-related sounds in these regions. Right panel: auditory-evoked potentials in response to context-related sounds that typically cue a responsive action by the listener (e.g. a ringing telephone; red line) and to context-free sounds that do not elicit responsive actions (e.g. a ringing tower bell; blue line). Responses were higher for action-evoking sounds than for non-action-evoking sounds 300 ms after the stimulus, mainly in the left premotor and inferior frontal and prefrontal regions [32,33]. (b) Hearing sounds related to human actions increases neural activity in left perisylvian fronto-parietal areas relative to hearing environmental sounds, a phase-scrambled version of the same sound, or a silent event. In the frontal cortex, the pattern of neural activity induced by action-related sounds reflected the body part evoked by the sound heard. A dorsal cluster was more involved during listening to and executing hand actions, whereas a ventral cluster was more involved during listening to and executing mouth actions. Thus, audio-motor mapping might occur according to somatotopic rules. The audio-motor mirror network was also activated by the sight of the heard actions, thus hinting at the multimodal nature of action mapping [34]. (c) Single-pulse TMS enables the exploration of the functional modulation of the corticospinal motor system during visual or acoustic perception of actions. During unimodal presentations, participants observed a silent video of a right hand crushing a small plastic bottle or heard the sound of a bottle being crushed. During bimodal conditions, vision and auditory stimuli were congruent (seeing and hearing a hand crushing a bottle; blue lines and bars) or incongruent (e.g. seeing a hand crushing a bottle but hearing the sound of water being poured in a glass, or hearing the sound of a hand crushing a bottle but seeing a foot crushing a bottle; red lines and bars). Compared with incongruent bimodal stimulation, unimodal and congruent bimodal stimulation induced an increase of amplitude of the motor potentials evoked by the magnetic pulse. Thus, corticospinal reactivity is a marker of both unimodal and cross-modal mapping of actions [35]. Data adapted, with permission, from [32–35].
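The TMS results in Figure 1c (and several below) are expressed as MEP amplitude as a percentage of baseline. As a minimal illustrative sketch, with invented amplitude values rather than data from [35], such a facilitation index can be computed as the percent change of the mean peak-to-peak MEP amplitude in a condition relative to a resting baseline:

```python
from statistics import mean

def mep_facilitation(condition_mv, baseline_mv):
    """Percent change of mean MEP amplitude relative to baseline;
    positive values indicate corticospinal facilitation."""
    return 100 * (mean(condition_mv) - mean(baseline_mv)) / mean(baseline_mv)

baseline    = [0.9, 1.1, 1.0, 0.95, 1.05]   # hypothetical resting MEPs (mV)
congruent   = [1.3, 1.25, 1.4, 1.35, 1.2]   # e.g. seeing and hearing the same action
incongruent = [0.95, 1.0, 0.9, 1.05, 1.1]   # e.g. mismatching sight and sound

print(mep_facilitation(congruent, baseline))    # ~ +30% vs. baseline
print(mep_facilitation(incongruent, baseline))  # ~ 0% vs. baseline
```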


It has been demonstrated, for example, that the blood oxygenation level-dependent (BOLD) signal in the left ventral premotor cortex is enhanced when seeing and hearing another individual tearing paper as compared with viewing a silent video depicting the same scene or only hearing the sound associated with the observed action [43]. No such dissociation was found for the parietal regions, indicating that cross-modal modulation might differentially impact on the different nodes of the AON. Similarly, corticospinal motor activity in response to the acoustic presentation of the sound of a hand crushing a small bottle was lower than to the presentation of congruent visuo-acoustic input (e.g. the same sound and the corresponding visual scene), and higher than to incongruent visuo-acoustic information (e.g. the same sound and a hand pouring water from the same bottle) [35] (Figure 1c). This pattern of results hints at a genuine, cross-modal modulation of audiomotor resonance [31]. Neurophysiological studies have identified multisensory neurons in the superior temporal sulcus (STS) that code both seen and heard actions [21].


Box 2. Audio-motor resonance in patients with apraxia

Crucially, causative information on the auditory mapping of actions has been provided by a study on patients with apraxia [37], where a clear association was identified between deficits in performing hand- or mouth-related actions and the ability to recognize the associated sounds. Moreover, using state-of-the-art lesion-mapping procedures, it was shown that, whereas both frontal and parietal structures are involved in executing actions and discriminating the sounds produced by the actions of others, the ability to recognize specifically sounds arising from non-human actions appears to be linked to the temporal regions (Figure Ia). This finding supports the notion that different neural substrates underpin the auditory mapping of actions and the perception of non-human action-related cues. Because this study was based on a sound–picture matching task, it is, in principle, possible that the audio-motor mapping deficit reflects a deficit in visual motor mapping, in keeping with the reported visual action recognition impairment of patients with apraxia [75].

To determine whether deficits in audio-motor and visuo-motor action mapping reflect a common supramodal representation, or were driven by visual deficits, results from the lesion-mapping study [37] were compared with those from neuroimaging studies in healthy subjects where visual or auditory action recognition was required. Distinct neural regions in the left hemisphere were identified that were specifically related to observing or hearing hand- and mouth-related actions. In particular, a somatotopic arrangement along the motor strip seems to be distinctive of visual- and auditory-related actions (Figure Ib) [31]. Thus, although multimodal perception might optimize action mapping, an independent contribution to this process could be provided by vision and audition. Olfactory mapping of actions has not yet been performed in patients with brain damage.


Figure I. Visual and auditory action mapping in brain-damaged patients. (a) Direct evidence for the anatomical and functional association between action execution and discrimination during matching of specific visual pictures to previously presented sounds in patients with brain damage and with or without apraxia. Voxel-based lesion-symptom mapping (VLSM) analysis demonstrated a clear association between deficits in performing hand- or mouth-related actions, the ability to recognize the same sounds acoustically, and frontal and parietal lesions [37]. (b) Cortical rendering shows the voxel clusters selectively associated with deficits (VLSM study) or activation (fMRI studies) when processing limb- or mouth-related action sounds [31]. Data adapted, with permission, from [31,37].
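Voxel-based lesion-symptom mapping (VLSM), the method behind Figure I, asks for every voxel whether patients whose lesions include that voxel score worse on a behavioural test than patients whose lesions spare it. The following is a minimal, schematic sketch of that logic with simulated binary lesion maps and sound-recognition scores; real analyses add covariates (e.g. lesion volume) and corrections for multiple comparisons.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_patients, n_voxels = 30, 1000

# Simulated data: 1 = voxel lesioned in that patient; one behavioural score each.
lesions = rng.integers(0, 2, size=(n_patients, n_voxels))
scores = rng.normal(15.0, 2.0, size=n_patients)
scores[lesions[:, 42] == 1] -= 4.0   # make voxel 42 'critical' for the task

t_map = np.full(n_voxels, np.nan)
for v in range(n_voxels):
    lesioned, spared = scores[lesions[:, v] == 1], scores[lesions[:, v] == 0]
    if len(lesioned) > 1 and len(spared) > 1:
        # Lower scores in the lesioned group give a large positive t value.
        t_map[v], _ = stats.ttest_ind(spared, lesioned)

print(int(np.nanargmax(t_map)))   # should recover voxel 42 as the strongest association
```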


When driven by audiovisual bimodal input, the firing rate of a proportion of these cells was higher with respect to the sum of auditory or visual input alone. This superadditive response occurred when the seen action matched the heard action [44]. The STS is heavily connected to the frontal and parietal regions [45], thus hinting at the important role of temporal structures in the simulation of actions triggered by audiovisual inputs.
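Superadditivity, as used above for STS neurons and below for BOLD responses, simply means that the bimodal response exceeds the sum of the unimodal responses. A minimal illustrative check, with hypothetical firing rates rather than data from [21,44], might look as follows:

```python
def multisensory_profile(r_av, r_a, r_v):
    """Classify an audiovisual response against its unimodal components:
    report whether it is superadditive (exceeds the unimodal sum) and its
    enhancement relative to the best unimodal response, in percent."""
    superadditive = r_av > (r_a + r_v)
    best_unimodal = max(r_a, r_v)
    enhancement_pct = 100 * (r_av - best_unimodal) / best_unimodal
    return superadditive, enhancement_pct

# Hypothetical firing rates (spikes/s) for a congruent seen-and-heard action...
print(multisensory_profile(r_av=30.0, r_a=12.0, r_v=10.0))   # (True, 150.0)
# ...and for an incongruent pairing that falls below the unimodal sum.
print(multisensory_profile(r_av=18.0, r_a=12.0, r_v=10.0))   # (False, 50.0)
```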

A clear multimodal contribution to action mapping was demonstrated in a functional magnetic resonance imaging (fMRI) study where subjects observed hand-grasping actions directed to odourant objects (e.g. a fruit, such as a strawberry or an orange) that were only smelt, only seen, or both smelt and seen [40]. Grasping directed towards objects perceived only through smell activated not only the olfactory cortex, but also the AON (Figure 2). Moreover, perceiving the action towards an object coded via both olfaction and vision (visuo-olfacto-motor mapping) induced a further increase in activity in the temporo-parietal cortex [40]. A clear increase of corticospinal motor facilitation has also been reported both when objects were smelt but not seen and during the visual observation of grasping directed at the same objects, further confirming the presence of visuo-olfacto-motor resonance [39]. It is also relevant that the neural activity in response to visual–olfactory action-related cues in the right middle temporal cortex and left superior parietal cortex might be superadditive; that is, higher than the sum of visual and olfactory cues presented in isolation [40]. Accordingly, although unimodal input might trigger action representation, congruent bimodal input is more appropriate because it provides an enriched sensory representation, which, ultimately, enables full-blown action simulation. In human and non-human primates, the orbitofrontal cortex (OFC) receives input from both the primary olfactory cortex and the higher-order visual areas [46], making it a prominent region for the multisensory integration of olfactory and visual signals. When the integration concerns visual–olfactory representations related to the simulation of a given action, the product of such computation has to be sent to motor regions.


Figure 2. The flavour of actions. The effects of unimodal and cross-modal olfactory stimulation have been investigated in subjects who smelled the odours of graspable objects and observed a model grasping ‘odourant’ foods. Unimodal presentation consisted of either visual (seeing a model reaching to grasp food) or olfactory (smelling the graspable food with no concurrent visual stimulation) stimuli. In the bimodal presentation, visual and olfactory stimuli occurred together. (a) Single-pulse TMS was delivered to the primary motor cortex of healthy subjects. Sniffing alimentary odourants induced an increase of corticospinal reactivity in the same muscles that would be activated during actual grasping of the presented food. Moreover, cross-modal facilitation was observed during concurrent visual and olfactory stimulation [39]. (b) The observation of a hand grasping an object that was smelt but not seen activated the frontal, parietal and temporal cortical regions. No such activity was found during observation of a mimed grasp. Additive activity in this action observation network was observed when the object to be grasped was both seen and smelt [40]. Importantly, maximal modulation of corticospinal reactivity (TMS) and of BOLD signal (fMRI) was observed when both visuo-motor and olfacto-motor information were presented. This result suggests that, although olfactory stimuli might unimodally modulate the action system, its optimal tuning is achieved through cross-modal stimulation. Data adapted, with permission, from [39,40].


The OFC is heavily connected to brain regions involved in movement control. In particular, direct connections between the OFC and the motor part of the cingulate area, the supplementary and the pre-supplementary motor areas, the ventral premotor area and even the primary motor cortex have been described [47,48].

The possible functional gain of multimodal over unimodal coding of actions deserves further discussion. Motor resonance involves not only the commands associated with motor execution, but also a variety of sensory signals that trigger or modulate the action simulation process. Such modulation might be more effective when mediated by more than one sensory modality. Indeed, multimodal integration seems to enhance perceptual accuracy and saliency by providing redundant cues that might help to characterize actions fully. Importantly, multisensory integration appears more effective when weak and sparse stimuli are involved [49]. Thus, it might be that multisensory integration in the service of action simulation provides precise dynamic representations of complex sensory actions. Moreover, the functional gain derived from multimodal integration might support a robust and detailed simulation of the perceived action.

Can the social mapping of an action occur independently from vision or audition?

Inferences about the sensory and motor states of others can be drawn via mental imagery that involves specific neural systems (e.g. the somatic or the visual cortex for tactile and visual imagery, respectively) [9,10,50].


However, only the telereceptive senses allow the social perception of touch and pain. Perceiving the touch of others, for example, can occur only through vision [11], whereas perceiving pain in others can occur through vision (e.g. direct observation of needles penetrating skin) [10,51–53], audition (e.g. hearing another’s cry) [54], or even smell (e.g. the odour of burning flesh) [55]. Although the telereceptive senses can map the actions of others unimodally, cross-modal mapping is likely to be the norm. However, whether vision or audition is more dominant in modulating this process in humans is still an open question. The study of blind or deaf individuals provides an excellent opportunity for addressing this issue (Figure 3). A recent fMRI study demonstrated that the auditory presentation of hand-executed actions in congenitally blind individuals activated the AON, although to a lesser extent compared with healthy, blindfolded participants [56]. However, a clear lack of corticospinal motor reactivity to the sight and the sound of actions was found in individuals with congenital deafness and blindness, respectively [57]. This pattern of results suggests that, despite the plastic potential of the developing brain, action mapping remains an inherently cross-modal process.

Figure 3. The auditory and visual responsiveness of the action observation network in individuals with congenital blindness or deafness. (a) The modulation of resonant action systems was investigated by using fMRI while congenitally blind or sighted individuals listened and recognized hand-related action sounds or environmental sounds and executed motor pantomimes upon verbal utterance of the name of a specific tool. The sighted individuals were also requested to perform a visual action recognition task. Listening to action sounds activated a premotor-temporoparietal cortical network in the congenitally blind individuals. This network largely overlapped with that activated in the sighted individuals while they listened to an action sound and observed and executed an action. Importantly, however, the activity was lower in blind than in sighted individuals, suggesting that multimodal input is necessary for the optimal tuning of action representation systems [56]. (b) Corticospinal reactivity to TMS was assessed in congenitally blind (blue bars) or congenitally deaf (green bars) individuals during the aural or visual presentation of a right-hand action or a non-human action (the flowing of a small stream of water in a natural environment). All videos were aurally presented to the blind and sighted control subjects and visually presented with muted sound to the deaf and hearing control individuals (grey bars = control subjects). Amplitudes of the motor evoked potentials (MEPs) recorded from the thumb (OP) and wrist (FCR) muscles during action perception in the deaf versus the hearing control group and the blind versus the sighted control group indicated that somatotopically, muscle-specific modulation was absent in individuals with a loss of a sensory modality (either vision or hearing). The reduction of resonant audio- or visuo-motor facilitation in individuals with congenital blindness or deafness suggests that the optimal tuning of the action system is necessarily multimodal [57]. Data adapted, with permission, from [56,57].

Anticipatory coding of the actions of others based on auditory and olfactory cues

Influential theoretical models suggest that the human motor system is designed to function as an anticipation device [58] and that humans predict forthcoming actions by using their own motor system as an internal forward model. Action prediction implies the involvement of specific forms of anticipatory, embodied simulation that triggers neural activity in perceptual [59] and motor [60] systems. Evidence in support of this notion comes from a study in which merely waiting to observe a forthcoming movement made by another individual was found to trigger (unconsciously) a readiness potential in the motor system of an onlooker [61].


In a similar vein, by using single-pulse transcranial magnetic stimulation (TMS), it was demonstrated that the mere observation of static pictures representing implied human actions induced body-part-specific corticospinal facilitation [62–64]. Moreover, it was also demonstrated that the superior perceptual ability of elite basketball players in anticipating the fate of successful versus unsuccessful basket throws is instantiated in a time-specific increase in corticospinal activity during the observation of erroneous throws [65]. Although cross-modal perception studies indicate that auditory [17] or olfactory [40] inputs might predominate over visual perception in certain circumstances, information on the phenomenology and neural underpinnings of the anticipatory process that enables individuals to predict upcoming actions on the basis of auditory and olfactory information remains meagre (Figure 4).

Anticipation of sound sequences typically occurs when repeatedly listening to a music album in which different tracks are played in the same order. Indeed, it is a common experience that hearing the end of a given track evokes, in total silence, the anticipatory image of the subsequent track on the same album. Interestingly, the creation of this association brings about an increase of neural activity in premotor and basal ganglia regions, suggesting that analogous predictive mechanisms are involved in both sound sequence and motor learning [66,67].

Figure 4. Prospective coding of actions. (a) A single-pulse TMS study demonstrated that the observation of the start and middle phases of grasp and flick actions induces a significantly higher motor facilitation than does observation of the final posture. Higher resonance with upcoming than with past action phases supports the notion that the coding of observed actions is inherently anticipatory [63]. (b) An fMRI study demonstrated that neural activity in the ventral premotor cortex (which is part of the motor resonance network) and cerebellum is higher when subjects listen to a specific rhythm in anticipation of an overt reaction to it than when they listen to the same sound passively, without expecting an action to follow. This result sheds light on the nature of action–perception processes and suggests an inherent link between auditory and motor systems in the context of rhythm [66]. Data adapted, with permission, from [63,66].

The prediction of an upcoming movement and the anticipation of forthcoming actions might be even stronger when dealing with precise sound–action order associations. It is relevant that hearing sounds typically associated with a responsive action (e.g. a doorbell) brings about an increase in neural activity in the frontal regions, mainly in the left hemisphere, which is not found in response to sounds that do not elicit automatic motor responses (e.g. piano notes that have not been heard before) [33]. Thus, social action learning triggered by auditory cues might imply the acquisition of a temporal contingency between the perception of a particular sound and the movement associated with a subsequent action. This experience-related, top-down modulation of auditory perception might be used to predict and anticipate forthcoming movements and to create a representation of events that should occur in the near future. The grasping actions triggered by smelling fruits or sandwiches indicate that olfactory cues might trigger anticipatory action planning [40]. Therefore, the sensory consequences of an odour are integrated and become part of the cognitive representation of the related action. Unfortunately, studies on the role of social odours in triggering anticipatory representations of the actions of others are currently lacking (see Box 3).


Box 3. Questions for future research

• What are the developmental paths to the auditory and olfactory mapping of actions?
• Does action anticipation occur in the complete absence of visual mediation, and what are the neural systems underpinning the auditory anticipation of actions?
• Does subconscious perception of odours have a role in the mapping of actions?
• What is the exact role of olfactory cues in action anticipation?
• Based on the notion that multimodal modulation is essential for action mapping, is it possible to use different sensory inputs to make virtual interactions more veridical? This would imply, for example, that the presence of odours during virtual interactions with avatars would trigger greater embodiment of their sensorimotor states.


Conclusions and future directions

Hearing and smelling stimuli that evoke, or are associated with, actions activate their representation, thus indicating that not only vision, but also the other two telereceptive senses (i.e. audition and olfaction) might trigger the social mapping of actions somewhat independently from one another. Although the mapping process might be triggered by unimodal stimulation, the action representation process elicited by auditory and olfactory cues typically occurs within the context of multimodal perception, as indicated by the defective resonance in blind or deaf individuals. These results expand current knowledge by suggesting that cross-modal processing optimizes not only perceptual, but also motor performance. The analysis of how these two sensory channels contribute to the prospective coding of the actions of others remains a fundamental topic for future research.

Acknowledgements

Funded by the Istituto Italiano di Tecnologia (SEED Project Prot. Num. 21538), by EU Information and Communication Technologies Grant (VERE project, FP7-ICT-2009-5, Prot. Num. 257695) and the Italian Ministry of Health.

References

1 Smith, M.M. (2007) Sensing the Past: Seeing, Hearing, Smelling, Tasting and Touching in History, University of California Press
2 Driver, J. and Noesselt, T. (2008) Multisensory interplay reveals crossmodal influences on ‘sensory-specific’ brain regions, neural responses, and judgments. Neuron 57, 11–23
3 Brancucci, A. et al. (2009) Asymmetries of the human social brain in the visual, auditory and chemical modalities. Philos. Trans. R. Soc. Lond. B Biol. Sci. 364, 895–914
4 Serino, A. et al. (2009) Motor properties of peripersonal space in humans. PLoS One 4, e6582
5 Low, K.E.Y. (2006) Presenting the self, the social body, and the olfactory: managing smells in everyday life experiences. Sociological Perspect. 49, 607–631
6 Schutz-Bosbach, S. and Prinz, W. (2007) Perceptual resonance: action-induced modulation of perception. Trends Cogn. Sci. 11, 349–355
7 Rizzolatti, G. and Sinigaglia, C. (2010) The functional role of the parieto-frontal mirror circuit: interpretations and misinterpretations. Nat. Rev. Neurosci. 11, 264–274
8 Iacoboni, M. (2009) Imitation, empathy, and mirror neurons. Annu. Rev. Psychol. 60, 653–670
9 Avenanti, A. et al. (2007) Somatic and motor components of action simulation. Curr. Biol. 17, 2129–2135
10 Bufalari, I. et al. (2007) Empathy for pain and touch in the human somatosensory cortex. Cereb. Cortex 17, 2553–2561
11 Keysers, C. et al. (2010) Somatosensation in social perception. Nat. Rev. Neurosci. 11, 417–428
12 Tamietto, M. and de Gelder, B. (2010) Neural bases of nonconscious perception of emotional signals. Nat. Rev. Neurosci. 11, 697–709
13 Recanzone, G.H. (2009) Interactions of auditory and visual stimuli in space and time. Hear. Res. 258, 89–99
14 Campbell, R. (2008) The processing of audio-visual speech: empirical and neural bases. Philos. Trans. R. Soc. Lond. B Biol. Sci. 363, 1001–1010
15 Spence, C. (2009) Explaining the Colavita visual dominance effect. Prog. Brain Res. 176, 245–258
16 Burr, D. et al. (2009) Auditory dominance over vision in the perception of interval duration. Exp. Brain Res. 198, 49–57
17 Shams, L. et al. (2000) Illusions. What you see is what you hear. Nature 408, 788
18 Welch, R.B. and Warren, D.H. (1980) Immediate perceptual response to intersensory discrepancy. Psychol. Bull. 88, 638–667
19 Alais, D. and Burr, D. (2004) The ventriloquist effect results from near-optimal bimodal integration. Curr. Biol. 14, 257–262
20 Meredith, M.A. and Stein, B.E. (1983) Interactions among converging sensory inputs in the superior colliculus. Science 221, 389–391
21 Kayser, C. et al. (2010) Visual enhancement of the information representation in auditory cortex. Curr. Biol. 20, 19–24
22 Thomas, J.P. and Shiffrar, M. (2010) I can see you better if I can hear you coming: action-consistent sounds facilitate the visual detection of human gait. J. Vis. 10, 14
23 Brennan, P.A. and Kendrick, K.M. (2006) Mammalian social odours: attraction and individual recognition. Philos. Trans. R. Soc. Lond. B Biol. Sci. 361, 2061–2078
24 Prehn-Kristensen, A. et al. (2009) Induction of empathy by the smell of anxiety. PLoS One 4, e5987
25 Lundstrom, J.N. et al. (2009) The neuronal substrates of human olfactory based kin recognition. Hum. Brain Mapp. 30, 2571–2580
26 Walla, P. (2008) Olfaction and its dynamic influence on word and face processing: cross-modal integration. Prog. Neurobiol. 84, 192–209
27 Tubaldi, F. et al. (2008) The grasping side of odours. PLoS One 3, e1795
28 Tubaldi, F. et al. (2008) Effects of olfactory stimuli on arm-reaching duration. Chem. Senses 33, 433–440
29 Grafton, S.T. (2009) Embodied cognition and the simulation of action to understand others. Ann. N. Y. Acad. Sci. 1156, 97–117
30 Caspers, S. et al. (2010) ALE meta-analysis of action observation and imitation in the human brain. Neuroimage 50, 1148–1167
31 Aglioti, S.M. and Pazzaglia, M. (2010) Representing actions through their sound. Exp. Brain Res. 206, 141–151
32 Hauk, O. et al. (2006) The sound of actions as reflected by mismatch negativity: rapid activation of cortical sensory-motor networks by sounds associated with finger and tongue movements. Eur. J. Neurosci. 23, 811–821
33 De Lucia, M. et al. (2009) The role of actions in auditory object discrimination. Neuroimage 48, 475–485
34 Gazzola, V. et al. (2006) Empathy and the somatotopic auditory mirror system in humans. Curr. Biol. 16, 1824–1829
35 Alaerts, K. et al. (2009) Interaction of sound and sight during action perception: evidence for shared modality-dependent action representations. Neuropsychologia 47, 2593–2599
36 Galati, G. et al. (2008) A selective representation of the meaning of actions in the auditory mirror system. Neuroimage 40, 1274–1286
37 Pazzaglia, M. et al. (2008) The sound of actions in apraxia. Curr. Biol. 18, 1766–1772
38 Lundstrom, J.N. et al. (2008) Functional neuronal processing of body odors differs from that of similar common odors. Cereb. Cortex 18, 1466–1474
39 Rossi, S. et al. (2008) Distinct olfactory cross-modal effects on the human motor system. PLoS One 3, e1702
40 Tubaldi, F. et al. (2010) Smelling odors, understanding actions. Soc. Neurosci. 7, 1–17
41 Chen, Y.C. and Spence, C. (2010) When hearing the bark helps to identify the dog: semantically-congruent sounds modulate the identification of masked pictures. Cognition 114, 389–404
42 Castiello, U. et al. (2006) Cross-modal interactions between olfaction and vision when grasping. Chem. Senses 31, 665–671
43 Kaplan, J.T. and Iacoboni, M. (2007) Multimodal action representation in human left ventral premotor cortex. Cogn. Process 8, 103–113
44 Barraclough, N.E. et al. (2005) Integration of visual and auditory information by superior temporal sulcus neurons responsive to the sight of actions. J. Cogn. Neurosci. 17, 377–391
45 Allison, T. et al. (2000) Social perception from visual cues: role of the STS region. Trends Cogn. Sci. 4, 267–278
46 Rolls, E.T. and Baylis, L.L. (1994) Gustatory, olfactory, and visual convergence within the primate orbitofrontal cortex. J. Neurosci. 14, 5437–5452
47 Cavada, C. et al. (2000) The anatomical connections of the macaque monkey orbitofrontal cortex. A review. Cereb. Cortex 10, 220–242
48 Morecraft, R.J. and Van Hoesen, G.W. (1993) Frontal granular cortex input to the cingulate (M3), supplementary (M2) and primary (M1) motor cortices in the rhesus monkey. J. Comp. Neurol. 337, 669–689
49 Ghazanfar, A.A. and Lemus, L. (2010) Multisensory integration: vision boosts information through suppression in auditory cortex. Curr. Biol. 20, R22–R23
50 Moro, V. et al. (2008) Selective deficit of mental visual imagery with intact primary visual cortex and visual perception. Cortex 44, 109–118
51 Avenanti, A. et al. (2005) Transcranial magnetic stimulation highlights the sensorimotor side of empathy for pain. Nat. Neurosci. 8, 955–960
52 Betti, V. et al. (2009) Synchronous with your feelings: sensorimotor gamma band and empathy for pain. J. Neurosci. 29, 12384–12392
53 Minio-Paluello, I. et al. (2009) Absence of ‘sensorimotor contagion’ during pain observation in Asperger Syndrome. Biol. Psychiatry 65, 55–62
54 Bastiaansen, J.A. et al. (2009) Evidence for mirror systems in emotions. Philos. Trans. R. Soc. Lond. B Biol. Sci. 364, 2391–2404
55 Plotkin, A. et al. (2010) Sniffing enables communication and environmental control for the severely disabled. Proc. Natl. Acad. Sci. U.S.A. 107, 14413–14418
56 Ricciardi, E. et al. (2009) Do we really need vision? How blind people ‘see’ the actions of others. J. Neurosci. 29, 9719–9724
57 Alaerts, K. et al. (2010) Action perception in individuals with congenital blindness or deafness: how does the loss of a sensory modality from birth affect perception-induced motor facilitation? J. Cogn. Neurosci. DOI: 10.1162/jocn.2010.21517
58 Wolpert, D.M. et al. (2003) A unifying computational framework for motor control and social interaction. Philos. Trans. R. Soc. Lond. B Biol. Sci. 358, 593–602
59 Perrett, D.I. et al. (2009) Seeing the future: natural image sequences produce ‘anticipatory’ neuronal activity and bias perceptual report. Q. J. Exp. Psychol. 62, 2081–2104
60 Stanley, J. and Miall, R.C. (2009) Using predictive motor control processes in a cognitive task: behavioral and neuroanatomical perspectives. Adv. Exp. Med. Biol. 629, 337–354
61 Kilner, J.M. et al. (2004) Motor activation prior to observation of a predicted movement. Nat. Neurosci. 7, 1299–1301
62 Urgesi, C. et al. (2006) Motor facilitation during action observation: topographic mapping of the target muscle and influence of the onlooker's posture. Eur. J. Neurosci. 23, 2522–2530
63 Urgesi, C. et al. (2010) Simulating the future of actions in the human corticospinal system. Cereb. Cortex 20, 2511–2521
64 Candidi, M. et al. (2010) Competing mechanisms for mapping action-related categorical knowledge and observed actions. Cereb. Cortex 20, 2832–2841
65 Aglioti, S.M. et al. (2008) Action anticipation and motor resonance in elite basketball players. Nat. Neurosci. 11, 1109–1116
66 Chen, J.L. et al. (2008) Listening to musical rhythms recruits motor regions of the brain. Cereb. Cortex 18, 2844–2854
67 Leaver, A.M. et al. (2009) Brain activation during anticipation of sound sequences. J. Neurosci. 29, 2477–2485
68 di Pellegrino, G. et al. (1992) Understanding motor events: a neurophysiological study. Exp. Brain Res. 91, 176–180
69 Gallese, V. et al. (1996) Action recognition in the premotor cortex. Brain 119, 593–609
70 Umilta, M.A. et al. (2001) I know what you are doing. A neurophysiological study. Neuron 31, 155–165
71 Fogassi, L. et al. (2005) Parietal lobe: from action organization to intention understanding. Science 308, 662–667
72 Kohler, E. et al. (2002) Hearing sounds, understanding actions: action representation in mirror neurons. Science 297, 846–848
73 Keysers, C. et al. (2003) Audiovisual mirror neurons and action recognition. Exp. Brain Res. 153, 628–636
74 Mukamel, R. et al. (2010) Single-neuron responses in humans during execution and observation of actions. Curr. Biol. DOI: 10.1016/j.cub.2010.02.045
75 Pazzaglia, M. et al. (2008) Neural underpinnings of gesture discrimination in patients with limb apraxia. J. Neurosci. 28, 3030–3041