What makes confirmation bias irrational?
Confirmation bias is the human tendency to interpret new information in a way that confirms one's prior beliefs. In Post-Truth, for example, the philosopher Lee McIntyre defines it as “the tendency to give more weight to information that confirms one of our preexisting beliefs.”
But what, precisely, is irrational about such a tendency?
If I were to see something now that looks like a polar bear, my preexisting belief that there are no polar bears near here would rightly lead me to give this visual information less weight than I otherwise would. Analogously, there can be no real quarrel with a willingness to infer that studies supporting one's theory-based expectations are more probative than, or methodologically superior to, studies that contradict those expectations. And being a total ‘blank slate’ with no assumptions or preconceptions is not a desirable or realistic starting point.
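To put some numbers on that intuition, here is a minimal Bayesian sketch of the polar-bear case. All of the figures are invented purely for illustration; the argument above gives none.

```python
# A Bayesian sketch of the polar-bear case, with made-up numbers.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# H: "that really is a polar bear".  E: "it looks like a polar bear to me".
# Suppose the sighting is quite likely if there is a bear (0.9) and happens
# occasionally even if there is not, say because of a large white dog (0.05).
print(posterior(prior=0.5,   p_e_given_h=0.9, p_e_given_not_h=0.05))  # ~0.95
print(posterior(prior=0.001, p_e_given_h=0.9, p_e_given_not_h=0.05))  # ~0.02

# Same evidence, very different conclusions: the prior "there are no polar
# bears near here" rightly leads me to give the sighting far less weight.
```

The point is only that the prior sets how far the same piece of evidence should move us; it does not license ignoring the evidence altogether.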
So if a bias is a disposition to make false or irrational judgements, then so-called confirmation bias is not a bias.
There are probably conditions under which this form of reasoning fails, but it seems to me a normal, rather than a biased, form of reasoning.
But maybe the real bias is not in how much weight we give to new information, but in how we update our beliefs in response to it.
For example, there is some evidence that people with different preconceptions sometimes react in opposite ways to the same evidence. Suppose you and I both read one paper arguing that confirmation bias is irrational and another arguing that it is not. Afterwards, I become more confident that confirmation bias is, in fact, rational, while you become more convinced that it is not.
This one is more difficult. The first step is to note that the conditions under which this specific form of belief polarization occurs are much more nuanced than psychologists have long thought. It only occurs when people are able to find flaws in the uncongenial evidence. Without that, most people update their beliefs in the direction indicated by the new information.
But if we find (what we take to be) a justified reason to explain away that information, we become even more confident in the view opposite to the one the now-rejected information pointed towards.
It is as if Manchester United score to make it 1-0, the goal is disallowed because Ronaldo was offside, and as a result City go 1-0 up. That makes no sense.
Yet this might seem more rational on another analogy. Say you're defending a city (that is, your beliefs) against an invasion. You've just successfully warded off another attempt by the attackers to scale the walls. Now there are fewer attacking soldiers left, so you grow more confident that the city will not be taken. Your increased confidence is a rational recalibration after the probabilities have changed.
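The siege analogy can be given the same Bayesian gloss, again with invented numbers. Treat the relevant event not as "a study against my view appeared" but as "a study against my view appeared and I found a genuine flaw in it", and assume, as a premise open to dispute, that flawed attacks are more likely when my view is true than when it is false.

```python
# The siege reading of belief polarization, with made-up numbers.
# E is "a study against H appeared AND I found a genuine flaw in it".

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Assumption: if H is true, an attack on H is quite likely to rest on a flaw
# (0.7); if H is false, a flawed attack is much less likely (0.2).
print(posterior(prior=0.6, p_e_given_h=0.7, p_e_given_not_h=0.2))  # ~0.84

# Having warded off the attack, I end up more confident in H than the 0.6 I
# started with, which is the defender's recalibration. Whether those
# likelihoods are realistic is, of course, exactly what is at issue.
```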
Which analogy, do you reckon, is more plausible?
Nicely argued, thank you. However, it applies only to your selectively narrow definition of 'confirmation bias'. Per Wikipedia, for example, the term covers "the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values."