Julian Talbot

When Science Becomes Dogma

Examining the Reproducibility Crisis, Mask-Wearing Controversy, and Disagreements in Scientific Fields


Many of my articles are about practical risk management—the Bow-Tie model, how to define risk, etc. And that's valuable operational and tactical stuff, or at least I hope so.


But every so often, I like to write about the strategic risks. Impractical ideas, perhaps, but I find them interesting, so I write about them. I've been noticing what can only be described as own goals in the risk management world, and particularly in the scientific world.


Now, I'm not talking about politicians coming up with arbitrary climate goals and telling scientists to make the numbers fit the model. Nor about the evolution of knowledge that takes us from Newtonian physics to quantum physics and beyond.


Instead, I'm talking about how science is getting tangled up in knots and becoming unreliable. This isn't news, but the problem is how we latch onto an idea and lock it in because it is ‘a scientific truth.’ Social media and old-school media deserve much of the blame for this. But it is human nature to seek certainty, even where there is little or none, including in science.


Now, to be clear, I'm not throwing stones at the scientific method or at scientists themselves (well, not all of them). Most scientists are well aware of the limits of their understanding, generally far more so than the rest of us. Richard Feynman described his view on the matter as follows.


“I have approximate answers and possible beliefs and different degrees of uncertainty about different things, but I am not absolutely sure of anything and there are many things I don’t know anything about, such as whether it means anything to ask why we’re here. I don’t have to know an answer.” - Richard Feynman

For much of recent history, science has been regarded as a bastion of truth and progress, with the scientific method as a cornerstone of objective inquiry. However, science is not immune to the risk of dogma — the unquestioning acceptance of established beliefs. When scientific beliefs become dogmatic, they can stifle innovation, limit critical thinking, and hinder progress.


I don't think we are witnessing the end of science, as some people might suggest, though I make no promises. I'm optimistic about our ability to get science back on track and keep learning, but we have a few problems.


In this article, I want to skim the surface of the reproducibility crisis, the controversy surrounding mask-wearing during the COVID-19 pandemic, and disagreements among scientists in quantum physics and paleontology to illustrate the dangers of scientific dogma.


The Reproducibility Crisis


The reproducibility crisis is a huge issue, and I wrote extensively about it in my book Future Uncertain. In brief, however, the reproducibility crisis (also known as the replicability crisis) refers to the growing realization that independent researchers cannot replicate many published scientific findings. Particularly in fields like psychology and medicine, many widely cited studies have proven difficult or impossible to reproduce. Yes. It's crazy, but true. Much of the research we rely on to manage risks is not repeatable.


2011 marked a pivotal year in psychology, particularly in the field of social psychology. This year witnessed three significant events:

  1. The exposure of Diederik Stapel, a social psychologist who fabricated data in numerous studies.

  2. The publication, in a renowned journal, of a controversial paper claiming that humans could unconsciously predict random future events.

  3. The release of Nobel Laureate Daniel Kahneman's influential book "Thinking, Fast and Slow," which summarized his work and other social psychology findings.

Daniel Kahneman's book, had it been written a few years later, might have differed significantly from its 2011 version. At that time, the majority of psychologists believed in the reliability of results published in their journals.


This perception shifted dramatically following the publication of Daryl Bem's 2011 study (the controversial paper mentioned above), which provided ostensibly credible evidence for paranormal phenomena, a concept few believed possible. The incident underscored a critical issue in the field: statistically significant results in a published paper did not necessarily guarantee its trustworthiness.


The crisis raises serious questions about the reliability and validity of research findings. Factors contributing to it include publication bias, low statistical power, poor experimental design, and the misuse of statistics. Efforts to address the reproducibility crisis include promoting transparency in research methods, encouraging data sharing, and emphasizing the importance of replication studies.
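The combined effect of publication bias and low statistical power is easy to demonstrate. The following is a minimal sketch (standard-library Python only; the study counts, sample sizes, and significance cutoff are illustrative assumptions, not figures from any real dataset) showing how running many underpowered studies and publishing only the "significant" ones can manufacture an apparent effect from pure noise:

```python
import random
import statistics

random.seed(42)

N_STUDIES = 10_000   # independent "studies"
N_SAMPLES = 20       # observations per study (deliberately underpowered)
TRUE_EFFECT = 0.0    # the null hypothesis is actually true: there is no effect

# Standard error of the mean for N(0, 1) samples, and the ~5% two-sided cutoff
se = 1 / N_SAMPLES ** 0.5
cutoff = 1.96 * se

published = []
for _ in range(N_STUDIES):
    # Each "study" measures the mean of noisy observations of a zero effect
    observed = statistics.mean(random.gauss(TRUE_EFFECT, 1) for _ in range(N_SAMPLES))
    if abs(observed) > cutoff:
        # "Statistically significant" results get published; the rest vanish
        published.append(abs(observed))

print(f"Fraction of studies published: {len(published) / N_STUDIES:.1%}")
print(f"Mean published |effect|: {statistics.mean(published):.2f} (true effect: 0.0)")
```

About 5% of these null studies clear the significance bar by chance alone, and every published effect size is, by construction, far from the true value of zero. The published literature then shows a consistent, sizable "effect" that no replication of the full process would reproduce.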


Trust But Verify


How much misinformation is spread deliberately by vested interests and how much by well-intentioned but ill-informed professionals, I can't say. But let's take just a few of the many examples.


One of my favorite examples of disinformation is the book “The China Study” by T. Colin Campbell, PhD. This book gets 4.7 stars on Amazon from 4,800 reviews. I assume the positive reviews come from well-meaning people. And yet, the book is so flawed that I was only about 30% of the way into it before I started noting internal inconsistencies.


I went back to the original research, as I tend to do, and found that much of the cited research was taken out of context or didn't support the point the author was trying to make. I won't waste your time here with all that is wrong with it; many others have already written about it. Denise Minger does a great job in her article ‘The China Study: Fact or Fallacy?’


More recently, and more significantly for our society, is our understanding of public health interventions. At the onset of the COVID-19 pandemic, mask-wearing was widely promoted to reduce the spread of the virus. We were all told to mask up for the public good. Arguably, we didn't have enough information to say one way or the other. But we do now.


With the benefit of two years of pandemic data, subsequent analyses cast doubt on the effectiveness of mask-wearing as a universal preventive measure. Research indicates that masks don't even work reliably in high-risk environments like hospitals.


The complete revision of what was once considered irrefutable regarding mask-wearing illustrates how scientific consensus can sometimes shift in light of new evidence. It underscores the importance of maintaining an open and critical stance toward scientific recommendations.


My point here is not that we were deceived. Rather, we made assumptions about the efficacy of masks and then presumed that it was an irrefutable truth. Health policies and medical advice across most of the globe were utterly wrong. There was no questioning of the accepted policy. People were slandered for questioning it. Even arrested for not wearing a mask.


All that is just the tip of the iceberg. I won't get into the vaccination question because the final analysis is still evolving. However, we do know that the advice that vaccines are more effective than natural immunity is demonstrably false.


Research now indicates that immunity from natural infection is roughly 13 times stronger than vaccine-induced immunity. People who had a natural infection but weren't vaccinated were 27 times less likely to get a symptomatic infection than people who had been vaccinated but had not had a natural infection.


All of this is interesting and useful data. However, it raises questions about why public health authorities are still (at the time of writing) promoting information contrary to this evidence. The fact that nobody gets paid for natural immunity may or may not have anything to do with this.


"It’s okay to have an incorrect scientific hypothesis. But when new data proves it wrong, you have to adapt. Unfortunately, many elected leaders and public health officials have held on far too long to the hypothesis that natural immunity offers unreliable protection against covid-19 — a contention that is being rapidly debunked by science." - Marty Makary

In some instances, then, what we are looking at is not a failure of science. Instead, it's a misuse of science: the presentation of outdated information by vested interests as irrefutable truth, and the willingness of many (most?) people to accept the idea that the science on a particular matter is settled.


And that is the very definition of dogma. “A principle or set of principles laid down by an authority as incontrovertibly true.”



Disagreements Among Scientists


Disagreements within scientific fields are not uncommon, as differing perspectives can lead to robust debate and ultimately drive progress. In quantum physics, for example, disagreements persist over the interpretations of quantum mechanics, such as the Copenhagen interpretation, the Many Worlds interpretation, and others. Similarly, the nature of quantum entanglement remains a topic of ongoing debate.


In paleontology, controversies abound over dinosaur classification and evolution and the causes of their extinction. These disagreements highlight the inherent uncertainties and complexities within scientific disciplines and emphasize the importance of maintaining an open mind and a willingness to reevaluate existing theories in light of new evidence.


The story of Reginald Sprigg, an Australian geologist, serves as a reminder that scientific progress can be slow. It emphasizes the importance of remaining open to reevaluating established theories and ideas in light of new evidence. Sprigg played a pivotal role in the discovery of Dickinsonia, an enigmatic organism that provides valuable insights into the early evolution of life on Earth.


Sprigg discovered Dickinsonia fossils in the Ediacara Hills of South Australia in 1946. Despite many attempts by Sprigg to share the significance of this finding, it was not widely recognized, and it took several decades for paleontologists to appreciate its importance fully. It wasn't until the 1980s and 1990s that the true value of Sprigg's discovery was acknowledged, as researchers began to understand that Dickinsonia and other Ediacaran biota represented some of the earliest complex life forms on Earth.


The Dangers of Scientific Dogma


When scientific beliefs become dogmatic, they can stifle innovation and impede progress. By discouraging critical thinking and skepticism, dogma can lead to the perpetuation of erroneous or outdated ideas. To advance scientific knowledge, it is essential to cultivate an environment that values open debate, questioning, and reevaluation of established theories.


Encouraging Openness and Collaboration in Science


A lot of work is being done to address the dangers of science being confused with dogma. Several strategies are being adopted globally to promote openness and collaboration in the scientific community:

  1. More open debate and questioning of established theories foster an environment where challenging the status quo is not only accepted but encouraged.

  2. Emphasizing the importance of interdisciplinary collaboration, recognizing that breakthroughs often occur at the intersection of diverse fields of study.

  3. Promoting transparency and data sharing, enabling independent researchers to verify and build upon the work of their peers.

  4. Encouraging the publication of negative results and replication studies, reducing the publication bias that favors novel findings over confirmatory research.

There is much more that can be said about this, and I cover it in more detail in Future Uncertain, but hopefully this has whetted your appetite to think more about the topic in your own life.


Conclusion


The balance between accepting scientific consensus and remaining open to questioning and reevaluation is critical for advancing scientific knowledge, and for everyday life. This article is intended to help promote a culture of skepticism and critical thinking. All of us need to guard against the dangers of dogma and continue pushing the boundaries of human understanding.


Openness, collaboration, and a willingness to challenge established beliefs are essential for ensuring the ongoing progress of scientific inquiry. I'll leave you with another favorite quote from Feynman.


“The first principle is that you must not fool yourself — and you are the easiest person to fool.”
