🙉 Facts don't change minds? Identity-protective cognition and belief polarization almost never happen, so yes, they do
Falsifiable Friday #4
Happy Friday! 😉
✝ According to Dan Kahan’s influential theory of “cultural” or “identity-protective” cognition, public disagreement over key societal risks (e.g., climate change, nuclear power) arises not because people fail to understand the science or lack relevant information, but because people endorse whichever position reinforces their connection to others with whom they share important ties.
🤼 This is generally referred to as a specific form of motivated reasoning, namely, “identity-protective cognition”. Because espousing beliefs that are not congruent with the dominant sentiment in one’s group could threaten one’s position or status within the group, people are motivated to “protect” their cultural identities.
🧏♂️ This means there’s no point in conveying unwelcome facts and evidence, because people will not use their reasoning to figure out which beliefs are best supported by the evidence, but to figure out how best to justify assessments of the evidence that fit the dominant position within their cultural group. Studies, experts, and so on that disagree with the group’s position X are dismissed on the grounds that, since they conclude not-X, they cannot be sound studies or genuine experts.
➡⬅ Because we weigh the evidence so differently, we’re stuck in our opposing camps and don’t converge when considering the same information. So exposing the public to information about the scientific consensus will only deepen attitude polarization by reinforcing people’s “cultural predispositions” to selectively attend to evidence.
🦯 The idea that such identity-protective cognition causes groups to drift further apart – and away from the truth – is popular these days. The New York Times, for example, has claimed that “Americans’ deep bias against the political party they oppose is so strong that it acts as a kind of partisan prism for facts, refracting a different reality to Republicans than to Democrats.” The common suggestion is that there’s no point in bothering with facts anyway, because people will just dismiss them if they’re inconvenient.
✍ In this week’s research-packed piece, I show that Kahan’s work doesn’t support the popular notion that motivated reasoning often makes people insensitive to evidence, such that facts won’t change their minds anyway. Such insensitivity is the exception, not the rule. New experiments show that the specific conditions under which motivated reasoning and belief polarization occur are much more constrained than often assumed.