Last night was a night of particularly strange dreaming for me. As best I can remember, I was trapped in a "Groundhog Day" (the movie) type cycle, only it was set in some experimental settlement that seemed to have a vague sci-fi element. I seem to remember a barn-like building which I thought of as a ship hangar. Weirdness. I don't remember enough details to make a coherent story. I'm not sure this is because dreams are hard to remember; rather, I suspect it's because most dreams aren't nearly as elaborate as we want to think they are. Then my situation shifted dramatically, and I thought to ask myself whether I was still dreaming. After thinking about it for a moment, I decided my present stream of experiences was too vivid to be a dream. As it happened, those experiences involved crawling uphill on a sidewalk, on my stomach, and looking back on it, the landscape wasn't really that elaborate or vivid.
Why does all this matter? It's because when we talk about Cartesian dream scenarios, we tend to forget that in dreams, our ability to reason is severely circumscribed. If it weren't, almost every dream would involve quick and easy realizations that we're dreaming, but dreams don't work that way. The dream hypothesis--as well as the hypothesis of a powerful being that can induce dream-like states--is a threat not only to our sense data; it's a threat to our ability to reason about the world. And my experiences last night show that we have at least one instance of a human being's reasoning powers failing him while he was trying to show it was reasonable to think he wasn't dreaming. I didn't think I was absolutely certain I wasn't dreaming; I just thought in ordinary terms that I wasn't, just as I do now. And my reasoning, in retrospect, looks absurd.
Now leap, temporarily, to another subject. One attractive defense of the use of reason, or at a slightly more basic level our cognitive faculties, is that we can't help but use them. Any argument that calls into question our cognitive faculties will depend on them, and therefore be self-defeating, right? Or consider this: there are times when, greatly disturbed by the difficulty of basic philosophical problems, I've contemplated abandoning concern for truth and devoting myself to studying the psychology of persuasion, learning to manipulate people's beliefs as effectively as possible. It's not something that makes me feel good to think about, but I've contemplated it. The problem is that studying psychology presupposes a lot about truth, our ability to know things, and so on. As tempting as it is to turn and run in the face of intellectual difficulties, a retreat into the psychology of belief makes no sense.
This looks like an airtight argument for continuing the search for truth and trusting our basic cognitive faculties in doing so. Yet, as was just explained, it's clear that they can go wrong, and can go wrong in basic, indeed shocking, ways. This obviously means they aren't infallible, but, even worse, it seems as if on some level we have no idea whether they're reliable at all. It seems a self-defeating speculation. But it also seems an undeniable possibility.