If you’re one of those new-fangled believers in introspection, a good mental discipline is to ask whether – and how – your habits of thought predispose you to notice certain types of evidence and ignore others.
The next set of posts will use the critical rationalists for case studies. Each will describe a habit they seem to exhibit, then link it to one or more blind spots. You may have similar habits and similar blind spots. I do.
Some psychologizing is inevitable, which the critical rationalists would hate. Oh well.
This series on the critical rationalists has gotten a bit scatterbrained – poorly structured. That’s because I started out thinking critical rationalism, while flawed, was useful as a checklist for things to look out for while making or evaluating arguments. My opinion has morphed – not suddenly – into something almost the opposite: critical rationalism makes it easier for people with certain habits of thought to indulge them and thus make their judgments less rational.
This shift makes it appropriate to take stock and establish a new direction.
The story thus far
I described critical rationalism in Popper by example. Around then, I realized that the critical rationalists had really botched their analysis of Marxism – and in a suggestive way. So I provided some background to their claims in Prelude to a discussion of some blind spots. Then I looked at two historical scenarios in which Marxists were supposed to have behaved badly, and showed, I hope convincingly, that they actually hadn’t. See Blind Spot 1: Immiseration and the pair of posts Background: actually, Marxists were better than that and Blind Spot 2: Revolution.
I picked the Marxism example because the critical rationalists’ comments were intemperate enough to catch my attention, and explaining what they got wrong was easier than, say, dissecting their equally intemperate comments about quantum mechanics after 1925. The Copenhagen interpretation “led to a defeat of reason within modern physics and to an anarchist cult of incomprehensible chaos.” Lakatos, Criticism and the Growth of Knowledge, 1970, p. 145. (Full text.) I hope to work more science-y content (my lay understanding of quantum mechanics and neurobiology) into later examples.
The argument going forward
The whole point of a methodology is to steer people away from their own weaknesses. So why did the critical rationalists’ own methodology fail them? They’re supposed to be good at this stuff! Why did they miss or ignore historical evidence that was readily available even in the 1970s? You could argue that critical rationalism is a tool for evaluating scientists’ behavior, and that I’m using it to evaluate people evaluating scientists’ behavior – a sort of type error. I don’t buy it. Critical rationalism is promoted as a general approach centered on the question “How would you know if you’re wrong?”
My claim is that critical rationalism amplifies certain mental habits, encouraging people to drift away from rationality and indulge their biases. Each of the next set of posts will look at a particular mental habit or tendency and describe how it leads critical rationalists to “see what they want to see and disregard the rest.” (The quote is a reference to Simon and Garfunkel’s song “The Boxer,” which uses the verb “to hear” rather than “to see.”)
So I’ll present a set of views of the same topic: critical rationalism’s flaws. On a whim, I’ll use the overarching title “36 Views of Mount CritRat.” That’s an homage to Hokusai’s “Thirty-six Views of Mount Fuji,” the most famous of which is “The Great Wave”:
There won’t be 36 different posts (we all hope!), and the mental habit discussed in some of them won’t have much to do with the post’s illustrative image, but hey: boys just wanna have fun.