Reality-Making: The Art of Distinguishing Truth from Falsehood
Understanding the constitution of knowledge (and the new attacks on it)
Why do you believe what you believe?
Over the last few months, I’ve become fascinated by (a) how each of us comes to accept something as true and (b) how societies come to some kind of public understanding about truth. Exploring the intersection of these questions has been immensely insightful for me. Today, I want to transmit some of that enthusiasm to you.
It will be a long ride, but your understanding of knowledge and truth will never be the same.
This essay has two parts. In part one, we’ll look at how we come to know what we (think we) know. I’ll introduce terms such as ‘indirect knowledge’, ‘a public understanding of truth’ and ‘the constitution of knowledge’. The second half focuses on a worry that our beautiful system of knowledge is breaking down. Ideas such as ‘post-truth’, ‘fake news’, and ‘filter bubbles’ will be central to that discussion.
No knowledge without others
How have you come to know what you (think you) know?
Assessing an idea’s legitimacy is hard work. If the only means of assessment is to verify the truth of it yourself, you’re doomed.
Which is why, if you examine even your most cherished convictions, you’re likely to find some authority or expert lurking underneath.
Most of what we use to inform ourselves is knowledge accumulated by others that we import into our minds and adopt as our own. It’s indirect knowledge:
Every statistic you come across, every study you learn about, everything you read in a textbook, everything you learn in school, everything you learn from your parents, everything you see or read in the news, everything you read on social media, everything you hear a politician or celebrity say, every assumption of conventional wisdom — it’s all indirect information. — The Thinking Ladder (Wait But Why)
Because someone I trust says so
When we take in knowledge this way, we come to believe something on the basis of claims other people make.
I believe that there are three atoms in a molecule of water — H2O — because my high-school chemistry teacher told me so. I’ve never detected the little H’s and O’s myself, yet I happily go through life believing that a water molecule contains one oxygen and two hydrogen atoms.
This isn’t laziness. It’s efficiency. And it’s necessary. Without this practice, individual knowledge would be confined to a tiny dataset of personal experience.
It would be weird to say my ideas about water’s chemical structure lack a proper foundation because I haven’t looked through the microscope myself. Indeed, in ‘epistemology’ — the branch of philosophy that deals with what it means for a belief to be true and for a true belief to count as knowledge — testimony is considered a source of default entitlement.
I trust my chemistry teacher (at least on this topic), so I take the information she’s worked hard for, say “thank you very much”, and essentially photocopy it into my brain. I’ve now made the knowledge mine. Similarly, I trust my newspaper, take the knowledge their war correspondent has risked his life for, and update my mental model about Syria.
However, indirect knowledge only works in your favor when you end up believing things that are actually true.
Surprisingly, then, one of the most important truth-seeking skills is learning how to assign trust.
The constitution of knowledge
We do not normally notice the air we breathe. Similarly, we take for granted the climate of trust that is required to support much of our knowledge. He who does not trust cannot know.
To see how this works, let’s start with an innocent example.
You’re looking to buy a new ride but know nothing about cars. What to do? In True Enough, Farhad Manjoo proposes:
Consumer Reports has the resources to test every car on the market, and you do not. So if the magazine says that Volvo’s the way to go, you listen. This isn’t always a bad strategy. After all, how often is Consumer Reports wrong about some product endorsement? Rarely, or else it wouldn’t be as vaunted as it is.
Consumer Reports has a good reputation. That’s why we take for granted we can reliably defer to their judgment about which cars rock and which suck. It’s only given that the authority we’re relying on has a certain status that it makes sense for us to outsource our decision to it.
Compare: we can’t sensibly expect to end up with a true belief if we take some indirect knowledge about Volvos from my high-school chemistry teacher, who doesn’t have a driver’s license (though she knows a lot about H2O).
Credentials matter. The entire edifice of trust-based indirect knowledge rests on them.
Because of a shared social ranking, we know when we can reasonably expect to import a true belief by making someone else’s conclusions on a given topic our own. This is major progress we’ve made since the freewheeling days before radio and television launched the very idea of mass media — the era of partisan newspapers and pamphleteers.
(As we’ll see later, we’re in danger of undoing this advancement. In a time of fragmented media, although information now flows more freely than it did in the past, today’s news landscape also helps us indulge our biases and preexisting beliefs.)
Now scale up
My chemistry teacher was my knowledge-shortcut for ‘chemistry facts’. Consumer Reports allows me to know ‘car facts’ without me ever stepping into a car. Now, what stands to general facts about the world in the way that chemistry teachers stand to chemistry facts and Consumer Reports stands to car facts?
Answer: the largest newspapers and national wire services, in addition to the broadcast TV networks, have long enjoyed unrivaled power in determining what all of us take to be true about the world. Most people rely on mainstream media institutions to deliver facts to them on a wide range of topics.
If established papers report often enough that immigrants are flooding your continent, you slowly start believing it’s a thing (even if it’s not). If the news shows yet another clip of CEO fraud, you adjust your opinion of managers downward.
The implicit message of such trusted sources is: “We’ve already done the hard work to vet this information, and it’s safe to incorporate it into your beliefs without much testing.”
It’s hard to overestimate the tremendous power that authorities we trust have over our opinions:
For more than forty years, ABC, CBS, NBC, the Associated Press, and a half dozen large newspapers, including the Post, working in loose concordance, have collectively set the American news agenda … For decades, they guided the American people to whichever topics they considered worthy of our attention, and we hung on their every word. Their power was legendary. Early in 1968, CBS’s Cronkite, a man Americans would have trusted with their checkbooks, ended a Tuesday evening telecast with his view that the United States was “mired in stalemate” in Vietnam. “If I’ve lost Cronkite,” President Lyndon Johnson remarked to an aide, “I’ve lost Middle America.” Johnson soon announced that he wouldn’t stand for reelection. — Farhad Manjoo, True Enough
If Consumer Reports says the new Volvo performs subpar, you don’t buy it. If your chemistry teacher says that water = H2O, you believe it. And if CBS takes a stand against Johnson, you listen.
The social valve’s crucial role
To pull it all together, this is how knowledge works.
Since we can’t ascertain for ourselves who’s right and who’s full of shit, we rely on a kind of social valve — call it prestige and recognition — to tell us whose claims on which topics to admit (or not) into our personal canon of knowledge. This fact — that we cannot but rely on such signals — indicates something very important: the ability to distinguish truth from falsehood is a collective ability.
Without such authority-conveying signs of trustworthiness, we’d all be starting from zero in doling out trust. We’d have no way to tell who’s likely to be right and who’s not. Indirect knowledge would be impossible.
“Individualism and falsity are one and the same,” the great American philosopher C.S. Peirce wrote. “One man’s experience is nothing if it stands alone.”
Knowledge: together, or not at all.
The two pillars of the constitution of knowledge
Thanks to this shared understanding I’ve been outlining, there used to be, first, an agreed-upon benchmark of accuracy. If you came across a social media post claiming the truth of Pizzagate — a theory according to which Hillary Clinton was running a child sex ring out of a Washington DC pizzeria — you would treat it as exactly what it is: a baseless conspiracy theory.
And secondly, we used to agree which sources were the trustworthy ones. If you based your opinions on the Financial Times and CBS, you were taken a little more seriously than if you took your theories from some obscure subreddit. There was a shared hierarchy of sources.
This constitution of knowledge offers humans a beautiful knowledge-acquisition shortcut and saves them the effort and opportunity cost of re-vetting what has already been tested.
However, especially in the USA, a dangerous skepticism about this two-pronged public understanding of truth is on the rise.
This is what we’ll explore in the second half of this essay. We’ll see how it’s getting harder to check indirect knowledge both against (a) the facts and (b) agreed-upon indicators of trustworthiness. A dangerous cocktail.
Cognitive independence is a fool’s errand
To summarize our journey so far, almost all the learning you’ve done in your life has flowed into you through a trust channel. And since most of your knowledge is like this — it comes from other people rather than from your own experience — the quality of your knowledge is a function of how good you are at doling out trust. Here, again, we need others: it’s impossible to make informed decisions about who to trust if there are no shared standards of accuracy and trustworthiness you can consult for your own vetting process.
In a paradoxical way, this is true today more than ever.
Economist Tyler Cowen estimates that the single biggest recent change in Western life has been the dramatic decline in the cost and inconvenience of getting information.
Nowadays, sitting at McDonald’s, it is possible to consume virtually every possible point of view on just about anything before you’re halfway through your burger.
All this data is empowering, in some sense, because it gives us a peek into fields where only experts once dared to tread. As a result, there’s no need to blindly trust your chemistry teacher’s claims about water, the local pastor’s speculations about the age of the earth, or the UN’s theories on climate change. In principle, whatever information they base their assertions on is accessible to everyone.
Now here’s the paradox.
While the possibility of upgrading indirect knowledge into direct knowledge — knowledge based not on what someone else told us but on our own verification — has never been closer to our fingertips, this gain is canceled out by today’s increase in complexity.
For instance, the mere fact that I need nothing more than an internet connection to read scientific papers published ‘open access’ means next to nothing. Even though I can get my hands on the information, I’d need years of training and initiation to be able to make sense of it.
Perhaps in the abstract, the fact that we can peek behind the curtain is empowering. In practice, however, it’s more likely to lead to vertigo.
Another example: one of the central obstacles to implementing a policy response to climate change is that laypeople are unable to assess the science for themselves. Even worse is that, in the words of American sociologist Salvatore Babones, “Laws and regulations have become so complicated that ordinary citizens are cognitively incapable of grasping how their governments really work.”
In the absence of expert comment, then, we find ourselves drowning in a sea of facts divorced from meaning, trying to stay afloat in an ever-more-opaque world.
So paradoxically, even though we have access to more information than ever, our ability to check up on the people we trust for our knowledge has decreased, not increased. Consequently, the role of trust has become bigger, not smaller.
We are not freer than ever. We are more dependent than ever.
There is no test for expertise available to the non-expert
That’s not all.
The experts we rely on, in turn, rely on vast networks of other experts themselves. A climate scientist analyzing core samples depends on the lab technician who runs the air-extraction machine, the engineers who made all those machines, the statisticians who developed the underlying methodology, and on and on.
Modern knowledge thus depends on trusting long chains of experts. And no single person is in a position to check up on the reliability of every member of that chain. Ask yourself: could you tell a good statistician from an incompetent one? A good biologist from a bad one? A good nuclear engineer, or radiologist, or macro-economist, from a bad one?
Any particular person might, of course, be able to answer positively to one or two such questions, but nobody can really assess such a long chain for herself.
Problems for epistemic dependence on experts
If our most important truth-seeking skill is correctly assigning trust, we’d better be able to check for ourselves whether the experts we rely on speak the truth. We have to identify the authorities on whom we rely and vet whether they get it right. But that, as I’ve argued in the last two sections, is almost impossible.
This fact has huge implications.
If we can’t check experts’ claims, then not only do we depend on them for our indirect knowledge, but we’re also at their mercy.
For instance, since we can’t test the IPCC’s theories about climate change or the dominant narrative about the 2008 financial crisis, there’s no way for us to tell if they’re true. We hardly understand these explanations in the first place. This means that if they’re wrong, we’ve been sheepishly led to believe something false.
This asymmetry is an undeniable feature of life, one that requires humans to accept their mutual dependence, their basic need for trust, and the inescapable vulnerability that comes with it.
And vulnerability means exploitability.
Which brings us to the next stop on our ride: it’s easier than ever to abuse the current conditions of the many-media, many-experts world to mislead us into believing false things.
Breaking the constitution of knowledge
As we’ve seen, most of our knowledge is indirect knowledge, and indirect knowledge only helps you when you end up believing things that are true. For this to be the case, we need to trust the right people. However, since we can’t ascertain for ourselves who’s right and who’s full of shit, we rely on a kind of social valve to tell us which experts are reliable. Certain sources of information enjoyed a neutral authority: if you relied on them, you could assume the beliefs you ended up importing were OK.
In other words, we can only do our job of doling out trust when the social network does its job of doling out reputation.
In the previous sections, I pointed out how we can’t check experts’ claims against the facts. Increasingly, we can’t check them against safe and reliable indicators of trustworthiness either.
We are plagued by what Seth Godin diagnosed as the end of reputation.
The end of reputation
We laypeople use social cues such as reputation, prestige, and recognition to make an informed decision about which epistemic authority to trust. Unavoidably so.
As for chemistry facts and car facts, so for general facts about the world. If I can’t rely on the social valve to tell me which sources are likely to get things right, I’m helpless.
If this social valve were to stop working, that would greatly increase the odds of me making mistakes in assigning trust and importing false information into my beliefs.
Well, that’s exactly what’s happening:
[More and more people believe] the core institutions and norms of American democracy have been irredeemably corrupted by an alien enemy. Their claims to transpartisan authority — authority that applies equally to all political factions and parties — are fraudulent. — Donald Trump and the rise of tribal epistemology (Vox)
As a consequence,
The US is experiencing a deep epistemic breach, a split not just in what we value or want, but in who we trust, how we come to know things, and what we believe we know — what we believe exists, is true, has happened and is happening. — America is facing an epistemic crisis (Vox)
By insisting that all the fact-checkers and hypothesis testers are phonies, some people discredit the very possibility of a socially validated reality, and open the door to tribal knowledge, personal knowledge, partisan knowledge, and other manifestations of epistemic anarchy.
As a result, citizens of all political persuasions can increasingly live in their own bubbles, consuming only views similar to their own and rationalizing falsehoods by dismissing contradicting data along with its source. We become so secure in our bubbles that we accept only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there. The fear is that we thus become less and less able to spot false claims for what they are, ending up poorly informed and susceptible to manipulation.
Making sense of ‘post-truth’
Many people have claimed we’ve entered an era of ‘post-truth’. Not only do some politicians cultivate a blatant disregard for the facts, but their supporters seem utterly unswayed by evidence. It seems, to some, that truth no longer matters.
According to these commentators, the reason some folks embrace highly improbable conjectures is that they have fallen away from the ways of reason. They just don’t care about accuracy anymore.
This is an explanation in terms of total irrationality. To accept it, you must believe that a great number of people have lost all interest in evidence and truthfulness.
A better explanation, in my view, is that people who appear not to care about truth are not irrational, but systematically misinformed about where to place their trust.
Their background beliefs about who (not) to trust have been hacked.
Listen to what it actually sounds like when people reject the plain facts — it doesn’t sound like brute irrationality. One side points out a piece of economic data; the other side rejects that data by rejecting its source. They think that newspaper is biased, or the academic elites generating the data are corrupt. — Escape the Echo Chamber (C Thi Nguyen in Aeon Magazine)
They haven’t stopped caring for truth. Rather, their indirect knowledge mechanism has been sabotaged. Their social valve is corrupted.
They’re victims of the end of reputation.
The huge role of trust means our constitution of knowledge is exploitable
‘Post-truth’ is what you get when people do the standard indirect knowledge thing (as we all do), but their trust filters are wrongly calibrated.
An ‘echo chamber’ doesn’t destroy its members’ interest in the truth. It manipulates whom they trust, changing which sources and institutions they accept as trustworthy.
People who skillfully manipulate today’s fragmented media landscape can lie to more people, more effectively, than ever before. In a world that lacks real gatekeepers and authority figures, and in which digital manipulation is so effortless, spin, conspiracy theories, myths, and outright lies may get the better of many of us.
“What you’re seeing and what you’re reading is not what’s happening.”
More people and more institutions are setting more and more of these traps, and they’re getting better at it. That’s terrifying: we’ve got a lot to lose here.
Is post-truth chatter the canary in the coal mine?
I know this is true because…
If anti-truth propaganda succeeds, the result will be a sort of epistemic wild west in which ‘the truth’ is wholly a matter of perspective and agenda. And we won’t only harbor false beliefs about climate change, the financial crisis, and stuff like Pizzagate, but about many, many other things.
We should expect to get dumber and to believe more and more false things. We would all be vastly poorer if the constitution of knowledge shut down, and that is exactly what some people are trying to do.
If our epistemic authorities are unreliable, we simply have no alternative but to hold less rational beliefs. Either we must then accept the testimony of unreliable authorities or we must rely on our own relatively inexpert and uninformed judgments.
If we care about truth, we must rescue it from the lawless rovers of the sea of tribal epistemology, echo chambers and alternative facts.
Ultimately, communication depends on a shared body of facts and on agreed-upon social decision-making about what is and is not reality. When truth itself feels uncertain, how can a democracy be sustained?