Our weekly explainer on economics using lessons from popular culture. In Installment 48, the characters of a hit show fall for a common cognitive bias.
At first glance, How I Met Your Mother might seem like an ensemble sit-com, but look closer and it is a show about how a father magically convinces his teenage kids to sit on a couch for 208 episodes and patiently listen to him, thus setting unrealistic expectations for parents of teenagers around the world.
Teenage idealism aside, How I Met Your Mother does offer us an insightful economic lesson.
In an episode titled ‘Spoiler Alert’, one quirk of each of the five protagonists is brought to light and announced to the other four. For instance, Robin points out that Ted has a tendency to constantly correct people during conversations. As soon as she announces this to the others, they start noticing the habit in their interactions with Ted.
The strange part is that in all the years spent together in the show’s universe, they had never noticed it; but now that Robin has told them about it, they seek out examples of it for the rest of the episode. The show’s writers acknowledge this anomaly in the plot when Robin admits, “I never noticed this before, but now it is literally driving me crazy”. The protagonists, I believe, have fallen for what behavioural economists call Confirmation Bias.
Confirmation Bias, or My-Side Bias, is the tendency to selectively seek out evidence that supports one’s existing claims or beliefs and to ignore evidence to the contrary. The protagonists ignore all the times Ted does not correct someone and look only for instances where he does, just to support their hypothesis. An early articulation of the concept came in 1844 from German philosopher Arthur Schopenhauer, who observed in The World as Will and Representation: “An adopted hypothesis gives us lynx-eyes for everything that confirms it and makes us blind to everything that contradicts it.”
This phenomenon also explains the old adage: first impressions are last impressions. When we form an initial impression, our brain starts looking for and collecting evidence that supports that impression. If contradictory evidence is observed, our brain tends to ignore it or bend it to conform to our original beliefs.
Part of a marketer’s toolkit
Marketers have been using this bias to their benefit for a long time. For example, imagine that Amazon, through masterful advertising and some targeted discounting, convinces you that it sells everything for the lowest price in the market. As soon as you start believing this, you will selectively notice only the instances where the e-retailer’s prices are actually lower than its competitors’. To keep this belief alive, Amazon does not have to offer you the lowest price on everything. Instead, lower prices on just a few items will be enough to trick your mind. This is because you will ignore the evidence that contradicts your hypothesis and keep looking for confirmatory evidence. As soon as you find it, you’re likely to stop looking.
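This “search until confirmed, then stop” pattern can be made concrete with a toy simulation. The numbers below are entirely hypothetical and chosen for illustration: suppose Amazon is genuinely cheaper on only 10 out of 100 items, yet a shopper who checks prices only until the belief is confirmed will almost always find a cheaper item within a handful of checks and stop, while an unbiased audit of all 100 items tells a very different story.

```python
import random

random.seed(42)

# Hypothetical toy model: Amazon is genuinely cheaper on only 10 of 100 items.
# False = Amazon is cheaper on this item (confirming evidence for the belief).
N_ITEMS = 100
cheaper_elsewhere = [True] * 90 + [False] * 10
random.shuffle(cheaper_elsewhere)

def biased_check(items):
    """Check items one by one; stop as soon as the belief is confirmed."""
    checks = 0
    for item_is_cheaper_elsewhere in items:
        checks += 1
        if not item_is_cheaper_elsewhere:  # found a confirming instance
            return checks, True
    return checks, False  # never confirmed

checks, confirmed = biased_check(cheaper_elsewhere)
print(f"Biased shopper stopped after checking {checks} of {N_ITEMS} items; "
      f"belief confirmed: {confirmed}")

# An unbiased audit counts every item instead of stopping early.
share_amazon_cheaper = cheaper_elsewhere.count(False) / N_ITEMS
print(f"Unbiased audit: Amazon is actually cheaper on "
      f"{share_amazon_cheaper:.0%} of items")
```

Because a confirming item exists somewhere in the list, the biased search always ends with the belief intact, and on average it ends quickly; the full audit, by contrast, shows the belief holds for only a small fraction of items.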
Bane of online political discussions
Political discussions, especially on social media, are plagued by confirmation bias as well. For example, once people get tagged as strongly pro-Modi or against him, it becomes easy for them to dismiss opinions contrary to their own. Positions on policy matters, thus, are often based on political affiliation rather than on merit.
Political arguments are even harder to win on social media, as confirmation bias is often accompanied by an extreme cousin: the Backfire Effect. Under its influence, people presented with clearly documented evidence against their opinion tend to commit even more strongly to their existing beliefs, refusing to change their views no matter how persuasive the contrary evidence.
How can we avoid it?
The legendary psychologist Daniel Kahneman, in many ways the father of behavioural economics, tells us that none of us, himself included, are immune to Confirmation Bias. But there is merit in trying to reduce its influence on our decision making. The Farnam Street Blog suggests a few simple questions to ask oneself as a disconfirmation exercise. Whenever you read an article or a piece of information (including this one), ask yourself the following questions:
- Which parts did I automatically agree with?
- Which parts did I ignore or skim over without realizing?
- How did I react to the points I agreed or disagreed with?
- Did this post confirm any ideas I already had? Why?
- What if I thought the opposite of those ideas?
As you repeat this exercise, you will learn to spot when you are accepting information simply because it fits your existing belief system.
I believe that the Confirmation Bias is a universal aberration of the human mind. So I went online to find eminent intellectuals who confirm my belief, and found this Charlie Munger quote:
The human mind is a lot like the human egg, in that the human egg has a shut-off device. One sperm gets in, and it shuts down so that the next one can’t get in. The human mind has a big tendency of the same sort.