Our brains are susceptible to misinformation. They even encourage us to believe things that aren’t true. We take mental shortcuts, get confused, and create illusions. You may already be familiar with some of the psychological concepts that allow our brains to accept misinformation. Cognitive dissonance, first described in 1957, is one; confirmation bias is another.
Tommy Shane, of First Draft News, has outlined some additional key concepts that help explain why people are so vulnerable to believing misinformation.
The psychological feature that makes us most vulnerable to misinformation is that we are ‘cognitive misers’: we prefer simpler, easier ways of solving problems over ones that require more thought and effort. We’ve evolved to use as little mental effort as possible.
This is part of what makes our brains so efficient: You don’t want to be thinking really hard about every single thing. But it also means we don’t put enough thought into things when we need to — for example, when thinking about whether something we see online is true.
Dual process theory is the idea that we have two basic ways of thinking: System 1, an automatic process that requires little effort; and System 2, an analytical process that requires more effort. Because we are cognitive misers, we generally will use System 1 thinking (the easy one) when we think we can get away with it.
Automatic processing creates the risk of misinformation for two reasons. First, the easier something is to process, the more likely we are to think it’s true, so quick, easy judgments often feel right even when they aren’t. Second, its efficiency can miss details — sometimes crucial ones. For example, you might recall something you read on the internet, but forget that it was debunked.
Heuristics are cues we use to make quick judgments. We rely on them because doing so is easier than conducting a complex analysis, especially on the internet, where there is far more information than we can carefully evaluate.
The problem with heuristics is that they often lead to incorrect conclusions. For example, you might rely on a ‘social endorsement heuristic’ — that someone you trust has endorsed (e.g., retweeted) a post on social media — to judge how trustworthy it is. But however much you trust that person, it’s not a completely reliable indicator and could lead you to believe something that isn’t true.
“On social media, the heuristics (the mental shortcuts we use to make sense of the world) are missing. Unlike in a newspaper where you understand what section of the paper you are looking at and see visual cues which show you’re in the opinion section or the cartoon section, this isn’t the case online” (Wardle, 2019).
Cognitive dissonance is the negative experience that follows an encounter with information that contradicts your beliefs. This can lead people to reject credible information to alleviate the dissonance.
Confirmation bias is the tendency to believe information that confirms your existing beliefs, and to reject information that contradicts them. Disinformation actors can exploit this tendency to amplify existing beliefs.
Motivated reasoning occurs when people use their reasoning skills to justify what they want to believe, rather than to determine the truth. The crucial point here is that people’s rational faculties, rather than lazy or irrational thinking, can cause misinformed belief.
Motivated reasoning is a key point of current debate in misinformation psychology. In a 2019 piece for The New York Times, David Rand and Gordon Pennycook, two cognitive scientists based at MIT and the University of Regina, respectively, argued strongly against it. Their claim is that people simply aren’t being analytical enough when they encounter information. As they put it:
“One group claims that our ability to reason is hijacked by our partisan convictions: that is, we’re prone to rationalization. The other group — to which the two of us belong — claims that the problem is that we often fail to exercise our critical faculties: that is, we’re mentally lazy.”
Pluralistic ignorance is a lack of understanding about what others in society think and believe. It can lead people to incorrectly believe that a political view is held by a majority when it is in fact held by very few. This can be made worse by rebuttals of misinformation (e.g., conspiracy theories), as they can make those views seem more popular than they really are.
A variant of this is the false consensus effect: when people overestimate how many other people share their views.
The third-person effect describes the way people tend to assume misinformation affects other people more than themselves.
Nicoleta Corbu, professor of communications at the National University of Political Studies and Public Administration in Romania, found a significant third-person effect in people’s perceived ability to spot misinformation: people rate themselves as better at identifying misinformation than others. This means people can underestimate their own vulnerability and fail to take appropriate precautions.
Fluency refers to how easily people process information. People are more likely to believe something to be true if they can process it fluently — it feels right, and so seems true.
This is why repetition is so powerful: if you’ve heard something before, you process it more easily, and are therefore more likely to believe it. Repeat it multiple times and the effect grows. So even if you encountered a claim only through a debunk, the sheer repetition of the original claim can make it more familiar, more fluent, and more believable.
It also means that easy-to-understand information is more believable, because it’s processed more fluently.
Bullshit receptivity describes how receptive you are to statements constructed with no regard for the truth; a meaningless cliché, for example. Bullshit is different from a lie, which intentionally contradicts the truth.
Pennycook and Rand use the concept of bullshit receptivity to examine susceptibility to false news headlines. They found that the more likely we are to accept a pseudo-profound sentence (i.e., bullshit) such as, “Hidden meaning transforms unparalleled abstract beauty,” the more susceptible we are to false news headlines.
This provides evidence for Pennycook and Rand’s broader theory that susceptibility to false news comes from insufficient analytical thinking, rather than motivated reasoning. In other words, we’re too stuck in automatic System 1 thinking, and not enough in analytic System 2 thinking.
Pennycook, G., & Rand, D. G. (2020). Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. Journal of Personality, 88(2), 185–200. https://doi.org/10.1111/jopy.12476