Hands up who wants to talk

The way bonobos and chimpanzees communicate suggests that language evolved from gestures, not vocalisations

ROWAN HOOPER

New Scientist, 5 May 2007


“WORDS would seem to have been necessary to establish the use of words.” Philosopher Jean-Jacques Rousseau pithily summed up the paradox underlying the evolution of language some three centuries ago. So, how did words arise without the words to explain them?

Biologists have long assumed that human language evolved from the basic vocalisations made by chimps and other primates, but this doesn’t help resolve Rousseau’s paradox. Now discoveries in chimps and other non-human primates suggest a solution: spoken language evolved from gesture.

“The idea is that our ancestors started out using hand gestures, and only later moved to speech,” says Frans de Waal of the Yerkes National Primate Research Center in Atlanta, Georgia. “Gestures appear first in human development, before speech, and babies can learn to use them to communicate faster.” As signals, gestures are evolutionarily more recent than vocalisations and facial expressions – apes use them, but monkeys don’t.

If this gestural hypothesis of how language evolved is correct, then like words, the meaning of a gesture should depend on the context in which it is used, and on what other signals are being given at the same time. Now Amy Pollick and de Waal have tested the idea by looking at how strongly gesture and vocal signals are tied to context in our closest primate relatives, chimps and bonobos.

Pollick and de Waal observed captive groups of bonobos and chimps and identified 31 gestures – defined as any movement of the forearm, hand, wrist or fingers used solely for communication – as well as 18 facial or vocal signals, and recorded the context in which they were used. They found that the facial and vocal signals had practically the same meaning in both species, but the same gesture was used in different contexts both between and within species (Proceedings of the National Academy of Sciences, DOI: 10.1073/pnas.0702624104).

For example, the vocal signal “bared-teeth scream” signals fear in chimps and bonobos, but the gesture “reach out up”, where an animal stretches out an arm, palm upwards, has different meanings. It may be begging for food, in the same way people beg for food or money, or it may be begging for support from a friend, says de Waal. “The open-hand gesture is also used after fights between two individuals to beg for approach and contact in a reconciliation. So the gesture is versatile, but the meaning depends on context.”

Humans often gesture in combination with speech, and chimps and bonobos also use such “multimodal” signals: making a gesture and vocalising at the same time. Pollick and de Waal found that multimodal signals were more likely to elicit a response in bonobos than in chimps. “It seems to fit other indicators that bonobos have a more complex integration of signals, so that gestures do not just emphasise the meaning of other signals, but perhaps transform them,” says de Waal.

The line that led to bonobos and chimps split from the one that led to Homo sapiens around 6 million years ago, and bonobos and chimps parted about 2.5 million years ago. However, genetic studies suggest we are slightly closer to bonobos than we are to chimps, and de Waal says their work suggests bonobos are a better species for understanding the evolution of language. For one thing, they seem to be further away from the common ancestor, at least in terms of complex communication, than chimps.

For instance, bonobos engage in vocalisations that resemble dialogue, such as males alternating screams at each other during confrontations. Pollick and de Waal also found that gestures vary from group to group in bonobos, and are more effective when used with other signals, something not seen in chimps.

The work offers strong support for the gesture hypothesis of language evolution, says Michael Corballis, a psychologist at the University of Auckland, New Zealand. “It implies that manual gestures are freer of context than vocal ones, and can therefore be adapted to language. To put it another way, manual gestures can be controlled voluntarily, whereas in non-human primates vocalisations are largely involuntary and limited to emotional situations.”

But Rafael Nunez, a cognitive scientist at the University of California, San Diego, cautions against drawing firm conclusions about the evolution of language from studies of captive chimps and bonobos, given their capacity for imitation. Some of their gestures may be “contaminated” by interactions with humans, he says. “One must study these species primarily in the wild.”

Another potential problem is that gestures don’t have syntax or grammar, says Kazuo Okanoya of the RIKEN Brain Science Institute in Wako, Japan. Also, while bonobos use multimodal signals, speech occurs in a single mode. “To strengthen the gesture hypothesis of language we need to know how multimodal signals changed into monomodal signals, and how the gestural proto-language shifted to the vocal domain,” Okanoya says.

Corballis thinks he has an answer. Speech itself is best considered a gestural rather than an acoustic system, he says. It’s just that the “gestures” are made by the tongue, larynx and lips. “My guess is that gestural language became more facial and less manual as our ancestors usurped the hands for other activities such as carrying, manufacturing and tool use.”

So how did gestures withdraw into the mouth and throat? The prime candidate is a mutation in the FOXP2 gene on chromosome 7. The human form of the gene is linked to fine motor control and language comprehension, and differs from that of chimps by just two codons. This means only two of the 715 amino acids that make up the FOXP2 protein are different in humans and chimps. “A common idea is that this mutation may have been critical to the evolution of language, but my belief is that it was critical to the evolution of speech,” says Corballis.

Bernard Crespi of Simon Fraser University in Burnaby, Canada, agrees (Trends in Ecology and Evolution, vol 22, p 175). “The thing about FOXP2 is that it is really an articulation gene more than a ‘language’ gene,” he says. “It gave humans much finer control over the many muscles involved in making sounds.”

The gesture hypothesis needs more research on the neurological connections between gesture and language (see “Linked in the Brain”). Sequencing FOXP2 in Neanderthals will also help, by enabling geneticists to put a more accurate date on when the critical mutations took place. Recent estimates have the last of the two mutations in the FOXP2 gene occurring within the past 200,000 years, after we diverged from the Neanderthals. The Neanderthal genome is “in the works”, says Crespi, and, once sequenced, should tell us when the first mutation occurred.


LINKED IN THE BRAIN

Monkeys mainly use vocalisations to communicate, whereas non-human apes use combinations of vocalisations and gestures, and the meaning of their signals is more dependent on context than in monkeys. The next step, seen in humans, is the refinement of these combinations and the contexts they occur in. The result is speech. “Gesture forms a neural scaffold for language,” says Bernard Crespi of Simon Fraser University in Burnaby, Canada.

To investigate this further, Roel Willems of the F. C. Donders Centre for Cognitive Neuroimaging in Nijmegen, the Netherlands, and colleagues used functional MRI to scan the brains of volunteers while they listened to sentences and watched gestures that either matched or mismatched the sentences. They found that regions in the left inferior frontal cortex were more strongly activated during mismatches (Cerebral Cortex, DOI: 10.1093/cercor/bhl141).

“This suggests that meaningful information which is partially conveyed through a gesture also relies on areas implicated in understanding a spoken word in a context,” says Willems. In other words, some parts of the brain – notably Brodmann’s area and Broca’s area (see Illustration) – are strongly involved in processing both speech and gesture.

Both locations are also thought to contain mirror neurons, which fire both when an action is performed and when the same action is observed being performed by another. “There is evidence for a tight link in the brain between language and action, and between language and gestures,” says Willems.

[Illustration: Broca’s area and Brodmann’s area]
