

A Note on the Chinese Room
Author(s): Hannoch Ben-Yami
Source: Synthese, Vol. 95, No. 2 (May, 1993), pp. 169-172
Published by: Springer
Stable URL: http://www.jstor.org/stable/20117775
Accessed: 13/04/2011 14:33


Springer is collaborating with JSTOR to digitize, preserve and extend access to Synthese.


    HANNOCH BEN-YAMI

    A NOTE ON THE CHINESE ROOM

ABSTRACT. Searle's Chinese Room was supposed to prove that computers can't understand: the man in the room, following, like a computer, syntactical rules alone, though indistinguishable from a genuine Chinese speaker, doesn't understand a word. But such a room is impossible: the man won't be able to respond correctly to questions like 'What is the time?', even though such an ability is indispensable for a genuine Chinese speaker. Several ways to provide the room with the required ability are considered, and it is concluded that for each of these the room will have understanding. Hence, Searle's argument is invalid.

Searle, in his description of the Chinese room, asks us to imagine a man who is locked in a room with a book containing syntactical rules of Chinese written in English. Assisted by these rules, the man can respond so well to questions written in Chinese that are passed into the room that, though he doesn't understand Chinese, it is impossible for someone outside the room to discern between the answers coming out of the room and those that would have been given by a genuine Chinese speaker. The man inside the room, answering perfectly in Chinese despite his lack of understanding it, is an analogy to the programmed computer. Searle thus concludes that syntax alone cannot endow a computer with understanding.1

To the best of my knowledge, the objections to Searle's analogy did not oppose the theoretical possibility of the Chinese room.2 I want to argue that the Chinese room described above is impossible even in theory, since the man cannot simultaneously be limited to the syntactical level and have the competence of a genuine Chinese speaker, and, therefore, the analogy is unacceptable. I think many found Searle's claim persuasive, or at least disturbing, because of the apparent theoretical possibility of the Chinese room; showing why it is not possible will, therefore, undermine Searle's position.

Consider the following question written in Chinese and passed into the room: 'What is the time?'. The man inside the room won't be able to answer this or like questions consistently if he relies only on syntactical rules, as he is supposed to do in Searle's suggestion. If the man is instructed to give an arbitrary answer to such questions, then he won't be able to answer this question consistently when it is, say, once again passed into the room half an hour later. If, on the other hand, he is instructed to answer: 'I don't know; I don't have a watch', then he won't have the competence of a genuine Chinese speaker. That is because the above example is only one of many. We can pass a colored card into the room and ask what its color is, play two notes on a piano and ask which is higher, and so on: to all these he won't be able to answer consistently. The man in the room, relying only on syntactical rules, will be a blind, deaf, etc., Chinese; i.e., a Chinese bereft of his senses.3

Synthese 95: 169-172, 1993. © 1993 Kluwer Academic Publishers. Printed in the Netherlands.

There are only two alternatives open to us, if we want to maintain the competence of a genuine Chinese speaker. The first is to include among the rules (i.e., the program) instructions of this kind: 'If you are given the sentence . . . (here the Chinese question corresponding to the English 'What is the time?' should be written) look at a watch and write a sentence according to these rules: . . .'. In this alternative, the man responds correctly by himself. Obviously, he will gradually start to learn Chinese and will begin understanding the sentences passed into the room. But even more important than that, his ability to answer the questions relies on his understanding of the concepts of time, color (in my second example), etc. If he is analogous to the programmed computer, then we have to conclude that the computer has understanding.

The second possible way to enable the man in the room to answer the question successfully is not to make him respond by himself, but to instruct him to pass the question on, possibly after some syntactical processing, to someone or something that will supply him with the correct answer; and the man in the room will only have to pass it on as the output, maybe after more syntactical operations. In this case, the man in the room remains at the syntactical level, while whatever supplies him with the answer has to refer to a watch. In such a case, the existence of understanding depends on what we include in our view. If we only consider the man in the room, then we may claim that he doesn't understand Chinese, but then neither is he equivalent to a genuine Chinese speaker nor, therefore, is he equivalent to a computer which is equivalent to such a speaker; hence, we cannot conclude from the man's lack of understanding that the computer will lack understanding. If, on the other hand, we look at the room combined with what is passing the answers into it, then we do have in this union the equivalence to a genuine Chinese speaker. But since an important part of the processing is not done by the man in the room, we cannot again conclude from his lack of understanding that a computer will lack understanding. On the contrary, since the ability of the system as a whole to process data is equivalent to the ability of a genuine Chinese speaker, it does seem legitimate to attribute understanding to it, so that if the analogy between the system and the computer holds, then the computer has understanding.4

In conclusion, we can say that Searle was right in arguing that mere syntactical rules won't suffice for understanding, but wrong in assuming that a computer, successfully imitating our linguistic abilities, can be limited solely to syntactical rules (the Chinese room). Such a computer would have to refer to things in the world, be it to a watch fixed inside it or to perceptual information via various instruments, in order to achieve the required competence; that is, the computer will have a semantics as well as syntax, and therefore will have understanding. "Computer programs are formal (syntactic)" (Searle 1990, p. 27), that's for sure; but the computer in which they are actualized makes contact with the surrounding world in a certain way, and in this way meaning comes into being - just as our language acquires meaning: through the use we make of it in our contact with the world. Indeed, the machines we build in AI do refer to the world surrounding them, and therefore do have understanding.
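[Editorial note: the contrast the argument turns on - a fixed syntactic rule-book versus a rule that consults the world - can be put in a toy sketch. This is an illustration only, not part of the original text; the rule table and function names are invented.]

```python
# Toy illustration of the two kinds of "room". A purely syntactic lookup
# gives the same reply every time, so it cannot answer a time-dependent
# question with a genuine speaker's competence; a rule that consults a
# clock makes the answer depend on the state of the world, not on the
# question's syntax alone.
import datetime

# The rejected alternative: a fixed, purely syntactic rule.
SYNTACTIC_RULES = {
    "What is the time?": "I don't know; I don't have a watch.",
}

def syntactic_room(question: str) -> str:
    # The reply never varies between askings.
    return SYNTACTIC_RULES.get(question, "...")

def room_with_watch(question: str) -> str:
    # The first alternative in the text: the rule tells the man to look
    # at a watch before writing his answer.
    if question == "What is the time?":
        return datetime.datetime.now().strftime("It is %H:%M.")
    return SYNTACTIC_RULES.get(question, "...")
```

The same colored-card and piano-note questions would need analogous world-consulting rules; the point is that each such rule steps outside pure syntax.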

NOTES

1 Searle wrote about the Chinese room in several places. I'm relying here on the description given in his 'Minds, Brains, and Programs' (1980; including open peer commentary and author's responses), on Chapter 2 of his Minds, Brains and Science (1984), and on his article 'Is the Brain's Mind a Computer Program?' (1990).

2 Twenty-seven commentaries were appended to Searle's essay in Behavioral and Brain Sciences. For more recent criticisms, see Dennett (1987) and Paul and Patricia Churchland (1990). Dennett and the Churchlands claim that the man is neither able to work fast enough nor to follow a sufficiently complex program in order to imitate successfully a genuine Chinese speaker, and therefore he doesn't understand Chinese. It follows that according to them syntax is sufficient for language-understanding: the only obstacles are the speed of processing and the complexity of the program. See also Rapaport (1988, p. 88), who claims "that being a purely syntactic entity is sufficient for understanding natural language". I think Searle is right in accepting the contrary as one of his axioms: "Syntax by itself is neither constitutive nor sufficient for semantics" (Searle 1990, p. 27).

3 Examples similar to mine are forwarded by Dennett (1980, p. 429: "pass the salt,


    please") and by Maloney (1987, p. 351: "prepare tea"). The difference between theexamples is that theirs require a nonlinguistic action, while mine requires an answer. Ithink Maloney is right to maintain that "the ability to affect the environment is not a

    universally necessary condition of language comprehension in naturally fluent linguisticagents" (1987, p. 352). My examples attempt to show that the man in the room cannotbe a 'fluent linguistic agent'.4 The closest objection to mine that I could find is the one Searle called "the RobotReply" (Searle 1980, p. 420). This is his formulation of the objection:

Computers would have semantics and not just syntax if their inputs and outputs were put in appropriate causal relation to the rest of the world. Imagine that we put the computer into a robot, attached television cameras to the robot's head, installed transducers connecting the television messages to the computer and had the computer output operate the robot's arms and legs. Then the whole system would have a semantics. (Searle 1990, p. 30)

The difference between this objection and mine is that this one takes the contact with the world as a contingent addition to the Chinese room, one it can function without, while I think it is a necessary addition in order to make the room function at all: without it, we won't have the required competence. I have cited the objection in full, in order to enable readers to judge for themselves the difference between the two. Searle answers this objection in Searle (1980, p. 420).

    REFERENCES

Churchland, P. M. and P. S. Churchland: 1990, 'Could a Machine Think?', Scientific American 262, 32-37.

Dennett, Daniel: 1980, 'The Milk of Human Intentionality', The Behavioral and Brain Sciences 3, 428-30.

Dennett, Daniel: 1987, 'Fast Thinking', in D. Dennett, The Intentional Stance, MIT Press, Cambridge, Massachusetts, pp. 323-37.

Maloney, J. Christopher: 1987, 'The Right Stuff', Synthese 70, 349-72.

Rapaport, William J.: 1988, 'Syntactic Semantics: Foundations of Computational Natural Language Understanding', in James H. Fetzer (ed.), Aspects of Artificial Intelligence, Kluwer Academic Publishers, Dordrecht, pp. 81-131.

Searle, John: 1980, 'Minds, Brains, and Programs', The Behavioral and Brain Sciences 3, 417-57 (including open peer commentary and author's responses).

Searle, John: 1984, Minds, Brains and Science, Harvard University Press, Cambridge, Massachusetts.

Searle, John: 1990, 'Is the Brain's Mind a Computer Program?', Scientific American 262, 25-31.

Department of Philosophy
Tel-Aviv University
Tel-Aviv, 69978
Israel