What drives the herd? Why do even the brightest sometimes make choices that appear wholly irrational to anyone else? Why don’t we learn from some mistakes? How can we make more consistently rational choices in money and love?
Over many decades, psychologists have sought to answer questions like these through careful scientific research. The cumulative result of these efforts is now a refined understanding of what it means to be human.
Human choices, it turns out, are often not just irrational, but predictably irrational. But as the science shows us, by understanding these cognitive biases, we can turn this irrationality to our advantage.
What Is a Cognitive Bias?
Put simply, cognitive biases are predictable thinking errors that occur as we process the information around us. Cognitive biases drive irrational actions and choices. They are the brain’s way of constructing a subjective reality that differs from the objective reality before us.
Why Learn About Cognitive Biases?
By learning about cognitive biases, we become better equipped to combat them. Understanding and calling out our predictable irrationalities forces us to think twice before diving into decisions we might live to regret. Simple awareness can help us outsmart our own biases.
What’s more, the research supports this. One recent study found that a single training session on cognitive biases reduced bias in decision making by a third. Numerous other studies likewise suggest that increased awareness of cognitive biases can improve financial decision making.
The Game-Changing Cognitive Biases
Psychology research has uncovered hundreds of cognitive biases, far too many for a single webpage. Thankfully, I believe the most important of these biases can be categorised into a much shorter list.
I call these biases the game-changing cognitive biases. Know about these, and you’ll have a bridge to understand other closely connected psychological concepts and biases. Know about these, and more importantly, you can jumpstart the process of thinking and deciding better.
With each bias below, you can click through to a more detailed article, which delves into some of the research and into strategies to combat it.
I. Beliefs: How our beliefs blind us
Confirmation Bias: The mother of all cognitive biases, confirmation bias is the tendency to search for, interpret and recall information that confirms our prior beliefs, blinding us to information that contradicts them. In an age of information at our fingertips, the internet serves as either a belief-confirming or a belief-challenging tool. We can only beat confirmation bias by consciously seeking to test our ideas and by stepping out of our echo chambers.
Dunning-Kruger Effect: We tend to overestimate our ability. Research shows that this excess confidence is particularly pronounced among the least skilled. As we learn more, our confidence tends to fall as we take stock of the gaps in our knowledge, then rises again as we approach expertise. The lesson: beware know-it-alls. Confidence and knowledge are not to be confused.
II. Attribution: How we falsely assign causation
Hindsight Bias: Five dangerous words: I was right all along. Hindsight bias is our tendency to perceive events that already happened as having been more predictable than they really were. It can surface as distortions in how we remember events, or in reframing events as inevitable or foreseeable. Researchers have demonstrated this effect across investment, politics, terrorism, and criminal law, among other areas.
Proportionality Bias: We assume big events have big causes. And yet the truth is often much simpler. When a significant global event unfolds, like a pandemic, many assume sinister motives are at work, rather than the more probable explanation of incompetence. Proportionality bias captures this tendency to assume outcomes have causes of proportional size.
III. Framing: How presentation skews our choices
Default Effect: When presented with a selection of choices, we tend to exhibit a preference for the default option. Wide-ranging experiments show that making an option the default significantly increases the likelihood of it being chosen. This innate bias can be a blessing or a curse. Marketeers can fool us into more expensive choices. But we can also capitalise on the default effect by automating smart financial decisions.
Contrast Effect: When making judgements, we exhibit a bias because of what we see or feel immediately before making that judgement. A positive contrast effect is where something is perceived as better than it is because we’ve experienced or observed something worse immediately before. The opposite is true for a negative contrast effect. Contrast effects can impact judgements across all walks of life.
IV. Anchors: How reference points distort decisions
Anchor and Adjustment: Anchors serve as reference points for estimates and choices. As a cognitive bias, anchoring and adjustment surfaces when we rely too heavily on an initial piece of information to make subsequent judgements. The core lesson: first impressions count, and they are everywhere. A high list price is an anchor, skewing estimates of real value. A first offer in negotiations is an anchor, skewing the results of negotiations. Even supermarkets have caught on, anchoring our experience to the first impression of the scent of flowers and the glow of fresh fruit at the front of the store.
Survivorship Bias: Success stories are inevitably more visible and more publicised. As a result, we tend to bank on what we can see, overestimating the proportion of successes to failures. Put differently, we focus on survivors and we overlook the rest. Survivorship bias is rife in a world that celebrates and promulgates success stories like never before.
V. Sunk costs and loss: How perceived loss skews judgement
Sunk Cost Fallacy and Loss Aversion: We hate loss. So much so, that seminal research found that losses are psychologically twice as powerful as gains. We respond to this dislike of loss by trying to avoid it. Because we’ve already invested time or money, we hold onto doomed investments for too long, we endure painful relationships longer than we need to, we finish books we shouldn’t bother with. Our aversion to loss and the sunk cost fallacy play havoc in tandem.
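That “twice as powerful” finding can be sketched with the prospect-theory value function from Kahneman and Tversky’s research. A minimal illustration in Python, using their published 1992 parameter estimates (loss aversion of about 2.25, diminishing sensitivity of about 0.88); the exact numbers vary across studies:

```python
def subjective_value(x, lam=2.25, alpha=0.88):
    """Psychological value of a gain or loss of size x.

    lam (loss aversion) and alpha (diminishing sensitivity) are
    illustrative estimates from Tversky & Kahneman (1992);
    fitted values differ across studies.
    """
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

gain = subjective_value(50)    # pleasure of winning 50
loss = subjective_value(-50)   # pain of losing 50
print(gain, loss)  # the loss looms roughly twice as large as the gain
```

The asymmetry is what makes sunk costs so sticky: walking away registers as a loss, so we keep paying to avoid feeling it.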
Endowment Effect and IKEA Effect: If we own it, we tend to overvalue it. And if we built it, we tend to overvalue it even more. Regardless of real market value, the sunk cost fallacy again plays a role in driving the endowment effect and the IKEA effect. These two effects are antithetical to a clutter-free life.
Effort Justification: We place more value on outcomes to which we have dedicated effort, despite a lower objective value. If we spent three years writing a book, those hours of work are likely to bias our valuation regardless of the book’s objective quality. If we passed a demanding initiation into a group, the gruelling entry may lead us to rate its members more highly than those who bypassed the initial effort. The danger in effort justification is its potential to breed false hope and years of misplaced effort.
VI. Time: How we undervalue the future
Hyperbolic Discounting: We prefer now over later. So much so, in fact, that even if rewards are objectively larger by waiting, we impulsively prefer smaller but sooner rewards. In effect, we excessively “discount” the value of future rewards. Behavioural economists call this hyperbolic discounting, and it’s prevalent across all walks of life. Importantly, research shows that our ability to delay gratification in favour of future rewards is a solid predictor for performance across a wide range of measures.
House-Money Effect: Our preference for immediacy perhaps explains why we treat newfound money differently. The house-money effect describes our tendency to treat money we’ve obtained easily or unexpectedly differently from money we’ve earned over time. It explains why an inheritance (someone else’s hard-earned money) is rarely treated as if it were our own hard-earned money.
How to Avoid Cognitive Biases
The question that remains, then, is how do we avoid these cognitive biases?
And the short answer is that we can’t avoid them all. We’re homo sapiens, after all. Not homo economicus. We cannot possibly take all life’s decisions like machines. Besides, sometimes our irrationalities aid rather than hinder our decision making.
The key to better decisions is to spot the pernicious thinking errors and check ourselves before they do more serious damage.
You can find more detailed ideas to combat these biases within each of the corresponding articles above. But in broad terms, there are three things we can do to better position our thinking:
#1: Grow our understanding. Without knowing about cognitive biases, we’re less able to check ourselves when they begin to rear their ugly heads. Knowledge of cognitive biases breeds increased self-awareness, and increased self-awareness breeds improved decision making. As we’ve already seen above, that’s well supported by the research.
#2: Develop a framework of mental models. A second important step is to build a repository of mental models. Mental models help us to see the world differently. When we’re confronted by innate cognitive biases, they help us solve problems and take decisions in a more systematic manner. They are mental bumpers that keep our ball in the alley. Learning about them can help take our problem solving to the next level.
#3: Find a reliable sounding board. Find the person (or people) who tells it like it is. Bounce ideas and decisions off them. Use them as a rationality thermometer. When the heat rises to dangerous levels, they can help open our eyes to the fire of irrationality in front of us.
If you’re interested in learning more about cognitive biases, you can explore more via the links to articles above or check out the full archive of psychology articles on the blog.
For even more reading, I recommend the following three books as solid introductions to the psychology of decision making.
- Thinking, Fast and Slow by Daniel Kahneman. Along with his late colleague, Amos Tversky, Kahneman is considered the father of behavioural economics. This book provides a comprehensive introduction to the world of cognitive biases.
- The Art of Thinking Clearly by Rolf Dobelli. Dobelli isn’t a psychologist, but he is a persuasive storyteller. This book is a highly accessible read, with each short chapter neatly explaining a different cognitive bias. The book details 100 cognitive biases in total.
- Predictably Irrational by Dan Ariely. One of the first in the new age of behavioural economists to detail his research to the wider public, Ariely provides a compelling overview of some of the most important and predictable thinking errors.