EECS 690 April 11


Page 1: EECS 690 April 11. Another “Special Property” Consciousness, Awareness, Sentience, Understanding, are all words used to vaguely describe various phenomena

EECS 690

April 11

Page 2:

Another “Special Property”

• Consciousness, Awareness, Sentience, and Understanding are all words used to describe, often vaguely, various phenomena of mental life.

• Searle is the most prominent among many who advance arguments to show that (ro)bots do not and cannot have some or all of the above properties.

Page 3:

The Chinese Room

• The background of the debate: Alan Turing wrote a paper in 1950 that proposed a blind conversational test as the arbiter of the question “Can a machine think?” His test has since been known as the Turing Test. Searle’s argument is meant as a direct challenge to the Turing Test’s acceptability for its stated purpose.
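The protocol Turing proposed can be sketched as a simple blind exchange. This is a minimal sketch, not Turing’s own formulation: `ask`, `human_reply`, `machine_reply`, and `guess` are hypothetical callables standing in for the judge and the two hidden interlocutors.

```python
import random

def imitation_game(ask, human_reply, machine_reply, guess, rounds=5):
    """One run of a blind conversational test (the imitation game).

    `ask`, `human_reply`, `machine_reply`, and `guess` are hypothetical
    callables standing in for the judge and the hidden interlocutors.
    Returns True when the judge identifies the hidden party correctly."""
    hidden_is_machine = random.choice([True, False])  # judge cannot see this
    reply = machine_reply if hidden_is_machine else human_reply
    transcript = []
    for _ in range(rounds):
        question = ask(transcript)
        transcript.append((question, reply(question)))
    # The machine "passes" when judges do no better than chance over many runs.
    return guess(transcript) == hidden_is_machine
```

The point of the blindness is that only the conversational behavior, not the appearance or substrate of the respondent, is available to the judge.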

Page 4:

The Chinese Room

Page 5:

The Chinese Room

• The Upshot: Searle believes that he is attacking these two assumptions that Strong AI makes:

– The mind is an abstract thing that is not necessarily identical with the human brain. Something could have a mind without having a human brain.

– The Turing Test is an adequate criterion for determining what has a mind.

Page 6:

The Chinese Room

• The assumption behind the upshot: Searle means to claim that understanding, thinking, and the like require a system with a semantics, not only a syntax. He argues that since digital computers have only a syntax, they are ipso facto non-thinking and non-understanding, no matter how they behave.
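Searle’s syntax/semantics distinction can be pictured with a toy program. The rule book below is an invented stand-in for the instruction book in his thought experiment: the program matches the shape of an input string and emits a canned reply, and nothing in it represents what any symbol means.

```python
# A toy "Chinese Room": formal symbol manipulation with no semantics.
# RULE_BOOK is a made-up lookup table; the program only matches
# uninterpreted input shapes to output shapes.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",      # "How are you?" -> "I'm fine, thanks."
    "今天天气怎么样？": "天气很好。",  # "How's the weather?" -> "It's nice."
}

def chinese_room(squiggle: str) -> str:
    # Look up the input's shape; meaning plays no role in the matching.
    return RULE_BOOK.get(squiggle, "请再说一遍。")  # fallback: "Please say that again."
```

To a Chinese speaker outside the room, the replies may look competent; inside, only pattern matching has occurred. That gap is what Searle’s argument turns on.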

Page 7:

Other Thought Experiments

• Consider that since you could make a digital computer out of a sack of marbles and a roll of toilet paper, you could make a functional “brain” out of the following:

– Beer cans and beer

– Macaroni pieces and water

– The population of the Americas (Ned Block’s example)

• Reply (due to Bill Lycan): Shrink yourself down to the size of a largish organic molecule and wander around John Searle’s brain. You might see things that look like basketballs whizzing around and banging into each other. Would anything you see lead you to believe you were looking at the operation of a conscious mind?
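The beer-can and macaroni examples trade on multiple realizability: a computation is fixed by its formal structure, not by the material that implements it. A minimal sketch of that idea, using ordinary Python rather than any particular substrate, builds boolean logic over whatever two tokens are chosen to play the roles of 0 and 1.

```python
def make_logic(zero, one):
    """Boolean logic over arbitrary tokens (multiple realizability):
    anything -- beer cans, marbles, people -- can play 0 and 1."""
    def nand(a, b):
        return zero if (a == one and b == one) else one
    def not_(a):
        return nand(a, a)           # NOT built from NAND
    def and_(a, b):
        return not_(nand(a, b))     # AND built from NAND
    return nand, not_, and_

# The same formal structure realized in two different "substrates":
bit_nand, bit_not, bit_and = make_logic(0, 1)
can_nand, can_not, can_and = make_logic("empty can", "full can")
```

Since NAND is universal, the rest of a digital computer follows by composition, whatever the tokens happen to be made of.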

Page 8:

Replies to the Chinese Room

1. The Systems Reply: Searle illegitimately focuses on the person in the box as lacking understanding of Chinese, while the system as a whole obviously understands Chinese.

2. The Brain Simulator Reply: Searle illegitimately claims that a simulation of mental activity isn’t mental activity. Certainly a simulated hurricane does no real damage, but on the other hand, a computer that controls an auto factory really makes cars.

3. The Other Minds Reply: Searle is in no position to say that Chinese speakers that he meets really understand their language.

4. The Intuition Reply: Ned Block says: “Searle's argument depends for its force on intuitions that certain entities do not think. But, (1) intuitions sometimes can and should be trumped and (2) perhaps we need to bring our concept of understanding in line with a reality in which certain computer robots belong to the same natural kind as humans.”

Page 9:

What then happens to our mentalistic language?

• It may be preserved and incrementally refined, rather than eliminated, as the vocabulary of a special science of the mind (i.e., psychology).

• It may be eliminated as outdated language (as we no longer talk about demons and wandering uteri) and replaced (perhaps by the vocabulary of neuroscience).

• Alternatively, it may retain its use, though not as a fine-grained scientific theory. (This is the rationale of Dennett’s “The Intentional Stance.”)