I think I must have first heard about John Searle's Chinese Room thought experiment about a year ago, when I first decided to start reading philosophy of mind on my own. The idea is this: a man who doesn't speak Chinese is in a room with a rule book for processing Chinese symbols, a book equivalent to the program of a Chinese-speaking computer. By following the rules (very quickly), the man makes it appear to people outside, who pass Chinese messages in and out through slots, that they're conversing with a Chinese speaker. Does he understand Chinese? No, says Searle. Therefore, understanding has to be about more than running a computer program.
This is one of those things that (cliche alert) sounds great until you think about it for a minute. I'm somewhat embarrassed to say that I failed to see what's really wrong with it until I read Searle's own account of the objections (which he doesn't ultimately buy) in his original article. He gives four possible objections to his position, allegedly based on conversations with people at different institutions: the systems reply, the robot reply, the brain simulator reply, and the combination reply. Each is accompanied by what Searle thinks is wrong with it.
For me, the whole thing fell apart at the systems reply. The objection there is that while the man in the room doesn't understand Chinese, the system as a whole (man plus room plus rule book) does. Searle's counter-move is to imagine a man who has internalized the rules and who, he claims, still wouldn't understand Chinese. From there, the obvious next step is to insist that the man also be able to incorporate sensory information and respond non-verbally. Searle never addresses this suggestion, even when he gets to combinations of the previous replies. Here Searle could stipulate that since the man is just following rules that seem arbitrary to him, he can't integrate the Chinese material with his English-language knowledge, and so doesn't understand it. But even under that stipulation, it's not clear what on Earth is happening. It's as plausible as anything to say that the man suffers from a bizarre split-mind disorder, and that the Chinese half of his mind does in fact understand what's going on.
I was tempted to write this up for my term paper, but then I realized that Searle has had his head handed to him at least a half-dozen different ways on this issue, and suddenly the idea didn't sound like so much fun anymore. It's another personal letdown for me--Searle's refusal to be described as either a materialist or a dualist briefly sounded very good to me when I was stewing over unclear definitions of those terms, but now it's clear to me that Searle has made a name for himself on some really bad philosophy--and I've found some reason to think no one really knows what he believes (a topic I may say more about later).