The influential philosophy professor John Searle introduced the Chinese Room Argument in his 1980 paper "Minds, Brains, and Programs." It was written to demonstrate a simple point: intelligent behavior does not equate to intelligence. This doesn't mean A.I. design is impossible, but that a behavior-based model of intelligence is flawed.
Deep Blue beat Kasparov. Your calculator can outshine Daniel Tammet. Did Deep Blue understand the game of chess? Does your calculator understand mathematics? Searle’s brilliant scenario illustrates the difference between intelligence and intelligent behavior.
This argument involves the following:
A Room, which contains:
- A small slot in a wall that’s large enough to pass pieces of paper through
- An English-speaking individual sitting at a desk, who has a massive supply of pencils, erasers, and scratch paper, along with a gigantic book of instructions written in English
A Chinese-speaking individual standing outside of the room. She has two pieces of paper:
- A story written in Chinese
- Questions about the story, which are all written in Chinese
Going forward, the English-speaking man will be called Mark and the Chinese-speaking woman Anne.
Anne slips both sheets, the story and the questions, through the slot. Mark receives them and sees pages of characters he can't understand, though they look to him like Chinese. He neither speaks nor reads Chinese, but he opens the gigantic book and finds instructions written in English explaining how to manipulate, sort, and compare the Chinese characters. Nothing about the meanings of these characters is written down. Only instructions are given: ways to copy, erase, and rewrite the characters.
Mark gets to work, following the instructions exactly as the book provides them. Sometimes he's told to write characters on paper; other times, to move or erase them. Following the instructions without a single original thought, he eventually reaches a point where the book tells him to pass his paper back through the little slot. Mark doesn't even know it, but he has produced correct answers to all of the questions, solely by following the rules given in the book.
Outside, Anne receives the paper. She reads the answers and confirms their accuracy. Now, you walk into the outside area and strike up a conversation with her.
You: Anne, do you think those answers come from an intelligent mind? Did the person inside understand the story?
Anne: Of course. Actually, the answers were quite perceptive.
And therein lies the problem. Who understood the story? All that occurred was Mark's sheep-like obedience to instructions from a book written by someone else.
So what does the Chinese Room represent?
- Mark – the CPU, executing instructions without thought
- Giant book – the program, the software that gives the CPU its instructions
- Scratch paper – the computer’s memory
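The mapping above can be sketched as a toy program. This is purely an illustration, not anything from Searle's paper: the rule book here is an invented lookup table, and the Chinese phrases are made-up examples. The point is that the program matches input symbols to output symbols by rote, with no representation of what any symbol means.

```python
# A toy "Chinese Room": input symbols are matched to output symbols
# by mechanical lookup. The table and phrases below are invented
# examples; meaning never enters into the procedure.

RULE_BOOK = {
    "谁是主角？": "小明是主角。",    # "Who is the protagonist?" -> an answer
    "故事发生在哪里？": "在北京。",  # "Where does the story take place?" -> an answer
}

def chinese_room(question: str) -> str:
    """Follow the rule book mechanically, like Mark at his desk."""
    # The function compares and copies characters; it cannot read them.
    return RULE_BOOK.get(question, "？")  # unknown input: emit a placeholder

print(chinese_room("谁是主角？"))  # a fluent-looking reply the "room" cannot read
```

To Anne, standing outside, the replies look perceptive; inside, there is only table lookup. That asymmetry is the whole argument in miniature.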
What’s the Point?
It doesn’t matter how perfectly a computer is designed to simulate the intelligence of a human being: its behavior is the result of mindlessly executing instructions, not of understanding. In this case, the means defines the end. You’re reading this sentence and understanding it without demonstrating behavior of any kind. A system’s behavior doesn’t indicate intelligence or understanding, and a system that behaves intelligently is not necessarily “intelligent.”
cog·ni·zance (noun): awareness, realization, or knowledge.