Housefull Economics

Isle of Cognitive Biases

Image credit: Paul Hudson (CC)

Our weekly explainer on economics using lessons from popular culture. In Installment 53, Mayor Kobayashi falls prey to the Semmelweis Reflex.

The plot of the latest Wes Anderson movie Isle of Dogs involves Kobayashi, the all-powerful mayor of a futuristic Japanese city, who banishes all the city’s dogs to a trash island. (A mythological backstory describes the Kobayashi clan’s hatred of dogs and reverence for cats.) The dogs are demonized by politicians and the authorities as carriers of disease and a threat to the health and safety of humans. The media machine unleashes a massive misinformation campaign against dogs for economic and political gain. Sound familiar? It gets eerier.

During the course of the movie, a scientist discovers a cure for dog flu that would enable dogs to coexist with humans again. But in one of the most art-imitates-life moments in recent cinematic history, the scientist is found dead — under mysterious circumstances, of course. His research, however, finds its way to the mayor, who discloses it to his tight-knit circle of cronies and sycophants, who want this information to remain under wraps.

They all fall victim to the Semmelweis Reflex, which refers to “the reflex-like tendency to reject new evidence or new knowledge because it contradicts established norms, beliefs or paradigms.”

This reflex is named after Dr Ignaz Semmelweis, a tragic figure in medical history. In 1847, the Hungarian physician was working as an assistant at the Vienna General Hospital. He noticed that doctors were often delivering babies and treating patients with the same unwashed hands with which they had performed autopsies. He proposed that the doctors were contaminating patients with “cadaverous particles,” and insisted that they wash their hands in a chlorinated lime solution before dealing with patients. This went against all accepted knowledge at the time. Doctors were perceived as gentlemen, and a gentleman’s hands could not possibly be dirty. Germ theory had not yet been developed and proven, so the idea of microscopic harmful agents seemed absurd. Many in the medical community were not only skeptical of Semmelweis’s findings but openly mocked him; he was eventually committed to a mental asylum, where he died.

This reflex can be seen as a stronger, related version of the Confirmation Bias: the tendency to believe evidence that confirms your pre-existing beliefs. The Semmelweis Reflex goes further — it is the rejection of evidence precisely because it challenges your beliefs.

It is not intelligence that dictates this reflex but how deep a person’s beliefs run. Smartphones and computer operating systems are good examples. Apple users will be unwilling to consider evidence that Android phones allow more flexibility and experimentation. Android fans will not accept that Apple phones are more intuitive. And don’t even try arguing that old Windows-vs-Mac debate with either side.

Until the seventeenth century, geocentricity (the theory that the sun and planets revolve around the earth) and not heliocentricity (the theory that the planets revolve around the sun) was the accepted and widely defended line of thought, so much so that Galileo, a proponent of the latter, was sentenced to house arrest by the Church.

The Semmelweis Reflex occurs most often when we are presented with too much information. The brain uses this reflex to filter out information it deems useless or incongruent with our existing beliefs and ideas. Nobel Prize winner Daniel Kahneman calls this “theory-induced blindness”: an adherence to a belief about how the world works that prevents you from seeing how the world really works. From climate change deniers to proponents of greater government regulation to staunch supporters of politicians and political parties in the face of their failures, people reject scientific and demonstrable evidence all the time because it does not align with their own beliefs and challenges the foundations of their mental models.

This reflex can also be viewed as a precursor to the Backfire Effect, which makes people (always others, never yourself) double down on their beliefs in the face of contradicting evidence.

Instead of viewing this reflex as closed-mindedness or stubbornness, it can be viewed as the brain’s first line of defense against the constant barrage of information we are inundated with. Fighting it requires constantly questioning our own beliefs and being mindful of what we have accepted as truth or “fact” when it may simply not be so.

Isle of Dogs also provides a prime example of the Binary Fallacy: treating the choice between cats and dogs as an either/or option when it simply isn’t.

There is no easy, safe and non-threatening way to challenge established norms and beliefs, especially when they are based on fear and, in most cases, have turned out to be self-serving. The key is to take a leaf out of the marketing books and package an idea better so that it is more palatable. New ideas that challenge existing norms must be presented gently, in a way that does not threaten those who would benefit from the knowledge, lest those ideas be rejected and ridiculed. In those situations, everyone loses.

We suffer from the Semmelweis Reflex every time we refuse to accept facts and instead rely on our prejudices or unfounded convictions. Harvard professor and four-term US Senator Daniel Patrick Moynihan once said, “Everyone is entitled to his own opinion, but not his own facts.” But sometimes, we mistake our opinions for facts.

About the author

Reshu Natani

Reshu Natani is currently an Associate Research Fellow
at Nayi Disha, a platform for liberal political ideas. She is also a
former social entrepreneur who studied economics and public policy at
Meghnad Desai Academy of Economics. She occasionally uses Keynes’s
aphorism about being dead in the long run to justify her nihilistic