The Semmelweis effect is the reflex-like tendency to reject new evidence or knowledge because it contradicts established beliefs, norms, or paradigms. In “The Game of Life”, Timothy Leary[4] defined the reflex as follows: "Mob behavior found among primates and larval hominids on undeveloped planets, in which a discovery of important scientific fact is severely punished."
Ignaz Semmelweis was a Hungarian physician who discovered in 1847, twenty years before the introduction of germ theory, that childbed fever mortality rates fell from 20 percent to 2 percent when doctors disinfected their hands with a chlorine solution before attending a woman in childbirth. Despite strong empirical evidence, most of the medical world rejected his findings, for both medical and non-medical reasons. In the early to mid-1800s, doctors argued that diseases resulted from imbalances among four humors: black bile, yellow bile, phlegm, and blood. According to the medical science of the time, each healthy person had a perfect balance of these four humors, and only an imbalance between them could cause disease. Semmelweis's findings about unhygienic practices contradicted the theory of humors and were therefore, by definition, wrong. Other doctors simply believed that a gentleman's hands couldn't transmit disease.
As is often the case with people trying, for good reasons, to change existing beliefs, life didn't end well for Semmelweis. He was committed to an insane asylum and died a lonely death shortly afterward.
Note from the writer: I feel fortunate not to have been touched by gentleman's hands when I was born more than one hundred years after Semmelweis's death.
Does the Semmelweis reflex exist in politics?
Deny, deny, and deny. There is probably no arena where the Semmelweis reflex is more apparent than politics, where denial of facts is a daily practice. Anything said by political opponents must, by definition, be rebuked as wrong or met with an alternative narrative. Why? Because recognizing that one side is right costs the other side the embarrassment, and the votes (!), of admitting it was wrong. There are countless examples where cynicism, ignorance, misunderstanding, cognitive laziness, stubbornness, or emotional considerations, all the way to stupidity and sheer irrationality, led to catastrophic consequences. A few of these examples:
- In the early months of 2020, the world was bombarded with endless public debates about Covid. Although not much was known about the disease and its spread, empirical evidence showed that countries that implemented social distancing and working from home, curtailed public events, and encouraged the wearing of face masks had lower contagion rates than other countries. Still, some political leaders publicly argued against these measures' effectiveness in containing the disease and refused to take the necessary decisions.
- It took politicians decades to introduce policies against smoking. They held firm to the belief that smoking did not cause cancer, despite the abundance of evidence to the contrary.
- Also, let's not forget the climate change deniers.
- And the election fraud claims in the 2020 United States presidential election.
How about the Semmelweis reflex in business?
Countless companies have gone out of business because their owners or managers held on to their beliefs even when a changing world showed that they should not.
The neighbors in my hometown in the Netherlands owned the largest coal-stove manufacturing company. They went slowly but surely out of business in the 1960s with the introduction of central heating systems fueled by gas oil. Despite overwhelming evidence to the contrary, they refused to believe that it was safe to install these heating systems in the basements of houses. The more people switched from coal stoves to central heating, the louder they raised their voices against it. Instead of retooling their company, they continued to manufacture coal stoves until they finally had to close the company's doors.
Eastman Kodak became another business dinosaur by believing that its hundred years of dominance in color film would not be affected by the introduction of digital camera technology. The company filed for bankruptcy in 2012.
The question remains whether the big petroleum companies will become extinct if they don't develop and execute a retooling-to-alternatives program.
So, what can we do to reduce the Semmelweis reflex?
To find an answer, we should first try to understand what's going on in a person’s mind.
In the Encyclopedia of Social Psychology, Craig Anderson[1] describes three psychological processes that keep such beliefs in place.
The first involves using the "availability heuristic" to decide what is most likely to happen. When judging your ability at a particular task, you are likely to recall how well you've done on similar tasks in the past. But whether you recall more successes or failures depends on many factors, such as how memorable the various occasions were and how often you've thought about them, and not necessarily on how often you've actually succeeded or failed.
A second process concerns "illusory correlation," in which one sees or remembers more confirming cases and fewer disconfirming cases than really exist.
A third process involves "data distortions," in which confirming cases are inadvertently created and disconfirming cases are ignored.
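The "illusory correlation" process can be made concrete with a small simulation. The sketch below is illustrative only, not taken from Anderson's entry, and all the probabilities in it are made-up assumptions: two binary events are generated completely independently, but the observer remembers cases that confirm a believed link more often than cases that contradict it, so the remembered sample suggests a correlation that does not exist in the actual data.

```python
import random

def remembered_confirming_share(n, p_remember_confirming=0.9,
                                p_remember_disconfirming=0.3, seed=42):
    """Simulate illusory correlation with biased recall.

    Two events a and b are generated independently (no real correlation),
    but co-occurrences that fit the believed link ("confirming" cases)
    are remembered with higher probability than disconfirming ones.
    Returns the share of confirming cases among remembered cases.
    """
    rng = random.Random(seed)
    remembered_confirming = 0
    remembered_disconfirming = 0
    for _ in range(n):
        a = rng.random() < 0.5            # e.g. "used remedy X"
        b = rng.random() < 0.5            # e.g. "got better" (independent of a!)
        confirming = (a == b)             # case fits the believed correlation
        p_recall = p_remember_confirming if confirming else p_remember_disconfirming
        if rng.random() < p_recall:       # biased memory keeps this case or not
            if confirming:
                remembered_confirming += 1
            else:
                remembered_disconfirming += 1
    return remembered_confirming / (remembered_confirming + remembered_disconfirming)

share = remembered_confirming_share(100_000)
print(f"Remembered share of confirming cases: {share:.2f}")
```

With these assumed recall rates (90% for confirming cases, 30% for disconfirming ones), roughly three quarters of the remembered cases are confirming, even though confirming and disconfirming cases each make up about half of what actually happened.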
The physicist Max Planck observed that a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.
Maybe a lifetime is a bit too slow for modern humanity, and we should apply faster-working techniques to implement change?
Research shows that the most obvious solution, asking people to be unbiased, doesn't work. However, several techniques do reduce the problem. One of the most successful is to get the person to imagine or explain how the opposite belief might be true. This de-biasing technique is known as "counter-explanation."
Ross, Lepper, and Hubbard[2] suggest that a crucial thinking skill called "inversion" may be the best remedy. Inversion allows us to challenge our beliefs by considering the opposite side. Examples of belief inversion are:
- How can I create products that everybody hates?
- What can I do to bring products to market slowly?
- What can I do to prevent innovation at my company?
Maybe the most important lessons in dealing effectively with change come from Spencer Johnson's book “Who Moved My Cheese?”[3] The main character of the story teaches us, "The quicker you let go of old cheese, the sooner you can enjoy new cheese."
Dedicated to Peter, my friend and cheese lover from Vienna who recently told me the life story of Ignaz Semmelweis
[1] Craig A. Anderson, Ph.D. (Stanford University, 1980), is a Distinguished Professor of Psychology at Iowa State University; Director, Center for the Study of Violence; and Past President of the International Society for Research on Aggression.
[2] Ross, Lepper, and Hubbard, "Perseverance in self-perception and social perception," Journal of Personality and Social Psychology, Stanford University, 1975.
[3] Spencer Johnson, Who Moved My Cheese?, 1998.
[4] Timothy Leary, The Game of Life, New Falcon Publications, 2015.