2, 4, 6. This sequence of numbers follows a specific rule. Discover the rule by proposing strings of three numbers.
This problem is a paraphrased version of a challenge put to participants in a seminal study by Peter Wason in 1960.
The process went as follows. Each time the participants put forward a string of three numbers, Wason told them whether it conformed with the rule or not. They could try as many strings as they wished, but once confident that they had identified the rule, the participants had only one attempt to guess it.
For most, a typical pattern emerged. Participants put forward “8, 10, 12” or “3, 5, 7”, which Wason then confirmed as conforming with the rule. They then guessed that the rule was “each number is two greater than the previous”.
Others took a different route. They assumed the sequence was multiples of the first number, like “3, 6, 9” or “5, 10, 15”. Again, Wason confirmed these were valid sequences. These participants confidently declared that the rule was “multiples of the first number”.
Others tried sequences like “4, 10, 16” and “8, 20, 32”, which again conformed with the rule. Their guess was that “the middle number is the average of the three numbers”.
To their surprise, all these groups were wrong and yet convinced themselves they were right. Most participants were so focused on confirming their hypothesis that they didn’t seek to falsify it.
The real rule: the numbers were ascending. Simple as that.
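To see why the confirming strategy fails, here is a minimal sketch in Python (my own illustration; Wason’s study involved no code). The `conforms` function stands in for Wason and encodes the real rule; `my_hypothesis` encodes the popular “add two” guess. Every triple built to confirm the guess passes, so those tests can never separate the guess from the real rule; only a triple the guess predicts should fail is informative.

```python
import random

def conforms(triple):
    """Wason's hidden rule: the three numbers are simply ascending."""
    a, b, c = triple
    return a < b < c

def my_hypothesis(triple):
    """A participant's guess: each number is two greater than the previous."""
    a, b, c = triple
    return b == a + 2 and c == b + 2

# Confirming strategy: propose only triples my guess says should pass.
for start in random.sample(range(1, 50), 3):
    triple = (start, start + 2, start + 4)
    assert my_hypothesis(triple)           # built to fit my guess
    print(triple, "->", conforms(triple))  # always True: nothing learned

# Falsifying strategy: propose a triple my guess says should FAIL.
probe = (1, 2, 100)  # steps of +1 and +98, but still ascending
print(probe, "->", conforms(probe))  # True, so the "+2" guess is refuted
```

A triple like (1, 2, 100) violates the “add two” guess yet still conforms, which is exactly the disconfirming test most participants never ran.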
What Is Confirmation Bias?
Peter Wason interpreted these results as demonstrating a preference for confirmation over falsification, and hence coined the term “confirmation bias”.
Put simply, confirmation bias can be defined as the tendency to search for, interpret, and recall information that confirms our prior beliefs, while blinding ourselves to information that contradicts them.
Confirmation bias is perhaps the most important of all our cognitive biases. It breeds fixed ideological standpoints that disregard evidence. It hinders our capacity to learn and progress. And as we deprive ourselves of the contradictory and shower ourselves with the confirmatory, it becomes harder and harder to dig ourselves out of the hole of solidified beliefs.
Confirmation Bias in the 21st Century
This isn’t a new problem. Throughout recorded history, our opinions have blinded us to the weight of evidence on the other side.
In 1620, Francis Bacon observed:
“The human understanding when it has once adopted an opinion … draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects[.]”
Yet, while confirmation bias isn’t a new problem, it’s perhaps now a bigger one.
In an age of information at our fingertips, the internet can be a belief-confirming or belief-challenging tool. And our natural disposition is to use it as the former.
Now we don’t just congregate in social circles that share our opinions, we find virtual groups and social networks that echo and amplify them. Now we don’t just read books that reinforce our views, we use search engines to filter and confirm them. Now we don’t scrutinise and then share news, we consume it live and race to spread it.
One of the main reasons we arrive in these echo chambers is that confirming our views massages our ego, while hearing opposing views feels downright unpleasant. Modern tools are another means of amplifying the pleasant and dimming the unpleasant.
In fact, so unpleasant is our experience of opposing views that even incentives sometimes aren’t enough. In a 2017 study, two-thirds of participants turned down the chance to earn money by listening to the other side’s arguments on same-sex marriage. And this resistance to listening was consistent on both sides of the debate.
How to Control Confirmation Bias
To escape the trap of confirmation bias, we need to challenge ourselves. Without conscious self-regulation, it’s not merely easy to find ourselves confined to the comfort of our opinions; it’s inevitable.
Here are some ideas to control confirmation bias.
#1: Delink identity and opinion
Opinions are most inflexible when we bind them with our identities. Seeing the other side, and then changing stance, is more difficult when we perceive these steps as retreats from our identities.
Sometimes, then, we need to extricate our opinions from our identities. But how?
In Think Like a Rocket Scientist, Ozan Varol suggests a subtle tweak in how we frame important beliefs. Instead of treating them with finality, which inevitably ties them to our ego, we should treat them as “working hypotheses”.
This idea is supported by the research. In a study of confirmation bias in criminal investigations, those who treated their hypothesis as a working hypothesis to be tested demonstrated less bias than those who generated a firm hypothesis early on.
But embracing this change in spirit requires openness to contradicting ourselves. And it then requires conscious effort to test our working hypotheses.
#2: Seek to falsify
Confirmation bias obscures disconfirming evidence, and so we must work harder to find it and remember it. As Aldous Huxley put it, “Facts do not cease to exist because they are ignored.”
To test our working hypothesis, we must probe and question its validity. We must step out of the echo chamber and lift the veil of filters that feed us only confirming evidence.
As we saw earlier, this isn’t easy or natural. But it can be normalised. Testing big opinions can become part and parcel of forming beliefs.
We can make a conscious effort to write down disconfirming evidence when we confront it, to explore contrary ideas with more rigour. When we embrace this power of falsification, we can dramatically reduce our bias.
And sometimes a friend can help us out on that journey.
#3: Listen
We tend to congregate with people who hold similar opinions. Fewer arguments, more agreement. A deeper massage for our egos, a weaker challenge to the identities we’ve bound to our beliefs. Through the lens of evolutionary theory, it’s a logical way to organise our tribes.
But it’s unlikely that everyone in your life holds the same opinions as you. The world is now too big, too divided, and too information-abundant to permit this outcome.
Our job is to identify the people who have thoughtfully arrived at a different view and seek to understand why. In short, our job is to listen. We will not always listen and learn, but we may find there is more disconfirming evidence than we first thought.
Thoughtful differences in opinion can provoke thoughtful changes in opinion, provided we’re seriously open to testing our beliefs. Occasionally, however, there’s a catch.
#4: Beware the backfire effect
Sometimes our confirmation bias takes things a step further: when we’re presented with disconfirming evidence, it can entrench us even more deeply in our original beliefs. Psychologists have coined this debated idea the “backfire effect”. It is worth some consideration.
One study examined political misconceptions. Some participants who were shown corrections ended the study holding their misconceptions even more strongly than before.
Similarly, another study gave information on the effectiveness of childhood vaccines to parents who were opposed to vaccination. The researchers found that some of these parents became more, not less, concerned about vaccines after receiving the information.
While the backfire effect has been disputed in the literature, it is worth considering the next time you receive information through your letterbox that challenges your beliefs.
Key Takeaways
- Confirmation bias is the tendency to search for, interpret, and recall information that is compatible with our beliefs, while diminishing disconfirming evidence.
- It is not a new phenomenon, but today’s abundance of information provides fertile ground for reinforcing this bias.
- To reduce confirmation bias, we need to delink identity and opinion, scrutinise and seek to falsify our working hypotheses, listen to those with opposing views, and beware the backfire effect.
For deeper reading on confirmation bias, I recommend Think Like a Rocket Scientist by Ozan Varol, Predictably Irrational by Dan Ariely, and Thinking, Fast and Slow by Daniel Kahneman.