The Chinese room experiment: do computers have minds?

The Chinese room thought experiment is a hypothetical scenario posed by the American philosopher John Searle to show that the ability to manipulate a set of symbols in an orderly way does not necessarily imply linguistic comprehension, that is, an understanding of what those symbols mean. In other words, understanding does not arise from syntax alone, which calls into question the computational paradigm developed by cognitive science to explain the workings of the human mind.

In this article we will look at what exactly this thought experiment consists of and what kinds of philosophical debates it has generated.

The Turing test and the computational paradigm

The development of artificial intelligence is one of the great twentieth-century attempts to understand, and even reproduce, the human mind through computer programs. In this context, one of the most popular models has been the Turing test.

Alan Turing (1912-1954) wanted to show that a suitably programmed machine could hold conversations like a human being. To do so, he proposed a hypothetical situation based on imitation: if we program a machine to imitate the linguistic ability of human speakers, place it before a panel of judges, and manage to convince 30% of those judges that they are talking to a human being, that would be sufficient evidence that a machine can be programmed to imitate human mental states; and, conversely, it would also serve as an explanatory model of how human mental states work.
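As a rough illustration of the criterion just described (the verdicts below are invented for the example), the test reduces to a simple count: the machine "passes" if at least 30% of the judges believe they spoke with a human.

    # Toy sketch of the imitation-game criterion: hypothetical verdicts from ten judges,
    # where True means the judge believed the machine was human.
    judge_verdicts = [True, False, False, True, False, False, False, False, True, False]

    fooled_fraction = sum(judge_verdicts) / len(judge_verdicts)
    print(f"Fooled {fooled_fraction:.0%} of judges -> passes: {fooled_fraction >= 0.30}")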

Within the computational paradigm, part of the cognitivist current suggests that the most effective way to gain knowledge about the world is through an increasingly refined reproduction of information-processing rules, so that, regardless of subjectivity or individual history, we could function and respond in society. On this view, the mind would be a faithful copy of reality, the place of knowledge par excellence and the tool for representing the outside world.

Since the Turing test was proposed, several computer systems have been programmed to try to pass it. One of the first was ELIZA, designed by Joseph Weizenbaum, which responded to users by matching their input against templates stored in advance, which led some users to believe they were talking to a person.
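As a minimal sketch of how such template-based responses can work (these rules are invented for illustration and are not Weizenbaum's original script), the program matches the user's sentence against stored patterns and echoes part of it back:

    import re

    # A minimal ELIZA-style responder: each rule pairs a regex "template"
    # with a canned reply that reuses part of the user's input.
    RULES = [
        (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
        (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
        (re.compile(r"my (.+)", re.I), "Tell me more about your {0}."),
    ]

    def respond(sentence: str) -> str:
        """Return a reply by purely syntactic template matching; no understanding involved."""
        for pattern, template in RULES:
            match = pattern.search(sentence)
            if match:
                return template.format(match.group(1).rstrip(".!?"))
        return "Please go on."  # default reply when no template matches

    print(respond("I feel misunderstood by computers."))
    # -> "Why do you feel misunderstood by computers?"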

Some more recent inventions inspired by the Turing test include, for example, the CAPTCHA used for spam detection, or Siri in the iOS operating system. But just as there have been people who tried to prove Turing right, there have also been those who questioned him.

The Chinese room: does the mind work like a computer?

Building on the experiments that sought to pass the Turing test, John Searle distinguishes between weak artificial intelligence (which simulates understanding but has no intentional states, that is, it describes the mind but does not equal it) and strong artificial intelligence (in which the machine would have mental states like those of humans, for example the ability to understand stories the way a person does).

For Searle, it is impossible to create strong artificial intelligence, which he set out to show with a thought experiment known as the Chinese room. This experiment poses the following hypothetical situation: an English speaker who does not know Chinese is locked in a room and must answer questions about a story told to him in Chinese.

How does he answer them? By means of a rulebook, written in English, that tells him how to arrange the Chinese symbols syntactically without ever explaining their meaning, only how they are to be used. By following this procedure, the person inside the room answers the questions appropriately, even though he does not understand their content.
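The point can be made concrete with a small sketch (the question-answer pairs below are invented placeholders): the "rulebook" is a purely syntactic lookup that maps one string of symbols to another, and nothing in it represents what the symbols mean.

    # A toy "rulebook": purely syntactic mappings from question symbols to answer symbols.
    # Like the person in the room, the program attaches no meaning to the characters;
    # it only matches their shapes.
    RULEBOOK = {
        "故事里的人叫什么名字？": "他叫李明。",      # invented placeholder pair
        "故事发生在哪里？": "故事发生在北京。",      # invented placeholder pair
    }

    def answer(question: str) -> str:
        """Return the answer symbols listed for the question symbols; no semantics involved."""
        return RULEBOOK.get(question, "我不知道。")  # default answer symbols

    # To an outside observer the output looks like competent Chinese,
    # yet nothing in the system understands what is being said.
    print(answer("故事发生在哪里？"))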

Now suppose there is an outside observer: what does that observer see? That the person inside the room behaves exactly like someone who understands Chinese.

For Searle, this shows that a computer program can mimic a human mind, but it does not mean that the program is equal to a human mind, because it has neither semantic capacity nor intentionality.

Impact on understanding the human mind

Applied to human beings, the above means that the process by which we develop the ability to understand a language involves more than having a set of symbols; other elements are needed that computer programs cannot have.

Not only that: this thought experiment has also broadened research on how meaning is constructed and where that meaning resides. The proposals are very diverse, ranging from cognitivist perspectives, which hold that meaning is in each person's head, derived from a set of mental states or given innately, to more constructionist perspectives, which ask how systems of rules and practices are socially constructed, are historical, and make social sense (a term has meaning not because it is in people's heads, but because it fits into a set of practical rules of language use).

Criticisms of the Chinese room thought experiment

Some researchers who disagree with Searle consider the experiment invalid because, even if the person inside the room does not understand Chinese, it may be that, taken together with the surrounding elements (the room itself, the furniture, the rulebook), there is an understanding of Chinese as a whole.

Faced with this, Searle responds with a new hypothetical situation: even if we remove the elements surrounding the person inside the room and ask him to memorize the rulebooks for manipulating the Chinese symbols, that person would still not understand Chinese; and neither does a computer processor.

Another response to this same criticism has been that the Chinese room is a technically impossible experiment. The answer to that, in turn, has been that something being technically impossible does not mean it is logically impossible.

Another of the most popular critiques comes from Dennett and Hofstadter, and it applies not only to Searle's experiment but to thought experiments in general as they have developed over the past few centuries: their reliability is questionable because they do not rest on empirical reality but on speculation close to common sense, which makes them, above all, "intuition pumps".

Bibliographical references:

• González, R. (2012). The Chinese room: a thought experiment with Cartesian bias? Chilean Journal of Neuropsychology, 7(1): 1-6.
• Sandoval, J. (2004). Representation, discursivity and situated action: a critical introduction to the social psychology of knowledge. Valparaíso, Chile: University of Valparaíso.
• González, R. (n.d.). "Intuition pumps", spirit, materialism and dualism: verification, refutation or epoché? Repository of the University of Chile. [Online]. Accessed April 20, 2018. Available at http://repositorio.uchile.cl/bitstream/handle/2250/143628/Bombas%20de%20intuiciones.pdf?sequence=1