Unveiling Agnotology: Empowering Informed Societies

Author: Katharina Schwaiger, acib GmbH

In the digital age, where information floods our screens, the rise of agnotology has darkened the pursuit of truth. Agnotology, a term coined by Robert N. Proctor1, refers both to the study of culturally induced ignorance and to the deliberate spread of misinformation to create confusion and doubt2. But who profits from misinformation? From the tobacco industry’s deceptive campaigns to the web of confusion surrounding the coronavirus crisis, this article delves into the most significant danger of agnotology: the rejection of scientific evidence. How does agnotology hinder scientific progress, and what risks does it pose for society as a whole? To answer these questions, we first need to dig deeper into the agnotologists’ box of tricks.

Three tricks common in agnotology

(In the following, the term “agnotology” is used specifically to refer to the deliberate spread of misinformation and does not encompass its scientific study.)

To grasp the gravity of agnotology’s impact, let’s explore a few examples:

Trick 1: Discredit well-established scientific consensus by raising the bar for scientific support to an unachievable level.

Decades ago, tobacco companies sowed seeds of doubt about the harmful effects of smoking on health. They accused existing studies, which presented conclusive results, of being inadequate because they tested the effects of smoke on animals rather than on humans3. Epidemiological studies comparing the health of non-smokers and smokers were likewise criticized as lacking controllability3. The tobacco industry demanded absolute proof of harmfulness in humans, which could ultimately only have been provided by ethically unjustifiable controlled human trials. The trick is to exploit the self-correcting nature of science, i.e. its openness to refutation. While scientifically grounded doubt is epistemic, agnotological doubt serves the opposite purpose: it aims to obstruct knowledge acquisition and cast shadows of uncertainty over well-established scientific facts.3–5


Trick 2: Produce Fake Science – the science behind the unscience.


The following is a particularly perfidious example of the manipulation of a scientific study, involving the chemical compound bisphenol A (BPA), commonly found in plastics. In a medical-biological laboratory experiment, researchers made a troubling discovery: cancer cells were mysteriously appearing in the centrifuge tubes used in their experiments. Their meticulous investigations ruled out other potential factors, ultimately leading them to identify BPA, found in the material of the centrifuge tubes, as the culprit. BPA’s structural similarity to the hormone estrogen raised serious concerns, as previous studies in mice had demonstrated adverse effects, including prostate enlargement and reduced sperm production. In response to these findings, the plastics industry, concerned about potential regulatory consequences, commissioned selective studies to downplay the health risks of BPA at low doses. These industry-funded studies took a particularly deceitful approach: for their experiments, they chose rat and mouse strains that had been selectively bred to exhibit obesity and elevated estrogen levels. As expected, these studies showed no significant effects, conveniently supporting the argument that BPA is safe at low doses.6,7

The danger of such subtle manipulations of scientific methods lies in their ability to deceive without raising immediate suspicion. Only experts well-versed in the specific fields can uncover these insidious tricks, making it challenging for the general public to discern fact from fiction.4
Trick 3: Cherry-pick only the sweetest data

It’s a sly strategy: certain agnotologists pluck out the specific data that fit their agenda and conveniently ignore the rest. In the context of the coronavirus pandemic, this trick caused tremendous distrust. With research running at full speed, the body of published evidence was still limited, and corrections to already published scientific data were common and crucial. Hypotheses had to be rigorously tested, and many were refuted, especially in this early phase of the crisis. In such circumstances, but not only then, it is remarkably reckless for a single scientist to undermine the integrity of the scientific consensus by going directly to the media with a cherry-picked, distorted opinion in order to stand in the limelight for once8. Most experts were trustworthy, but the few who were not shaped the public picture, as they gained particular attention in media outlets that prioritize sensationalism over careful reflection and fail to distinguish between opinion and information. It is the authority that scientists carry (or rather carried?) that makes this so irresponsible: when they engage in cherry-picking, their misleading narratives can gain unwarranted credibility. This can lead to misguided decisions, public skepticism, and, in the worst case, negative consequences for public health.9

What can we do?

Agnotology poses significant risks to our society, not only in the realm of public health but across various domains. Long-standing climate change denial10, for instance, showcases the enduring impact of agnotology despite overwhelming scientific consensus11. The rapid propagation of conspiracy theories, fake news and misinformation12, coupled with ever-tighter networking among “like-minded” people, allows extreme views to thrive within isolated echo chambers, causing social groups to drift further apart.13 The long-term consequences of this growing divide remain uncertain.
To prevent such consequences in the future, a steadfast commitment to responsible information dissemination is crucial. Scientific content shared via public channels such as the mass media should undergo the same rigorous peer review that is essential for publishing scientific data. Upholding the scientific principles of objectivity and reliability in communication, as suggested by the renowned science communicator Mai Thi Nguyen-Kim in one of her YouTube videos (https://www.youtube.com/watch?v=Nn2rJrKwENI), can prevent influential bad actors from gaining undue influence by presenting opinions as expertise. Scientists and science communicators play a pivotal role in raising awareness of agnotologists’ deceptive tactics and in countering the spread of regressive, anti-scientific ideas within society. Engaging citizens through citizen science, moreover, empowers the public to participate actively in studies and to see through agnotology’s deceptions. Collaborative efforts between scientists, policymakers, and the public, exemplified by initiatives like acib’s Scientific Ambassador Program and planned citizen science projects, bolster trust in scientific methods and strengthen society’s resilience against misinformation.

For more information on agnotology, watch this documentary: https://www.youtube.com/watch?v=mWTv9Z-OMrY

References:

(1)      Robert N. Proctor | Department of History. https://history.stanford.edu/people/robert-n-proctor (accessed 2023-08-03).

(2)      Proctor, R. N.; Schiebinger, L. The Making and Unmaking of Ignorance. 2008.

(3)      Proctor, R. N. The History of the Discovery of the Cigarette–Lung Cancer Link: Evidentiary Traditions, Corporate Denial, Global Toll. 2011, 87–92. https://doi.org/10.1136/tobaccocontrol-2011-050338.

(4)      Forschungsquartett | Agnotologie – Die Tricks der Fake-Wissenschaft | detektor.fm – Das Podcast-Radio. https://detektor.fm/wissen/foschungsquartett-agnotologie?highlight=agnotologie (accessed 2023-08-03).

(5)      Fern, M. To Know or Better Not to: Agnotology and the Social Construction of Ignorance in Commercially Driven Research. 2010, 53–72.

(6)      Reutlinger, A. What Is Epistemically Wrong with Research Affected by Sponsorship Bias? The Evidential Account. 2020, 1–26.

(7)      Carrier, M. Identifying Agnotological Ploys: How to Stay Clear of Unjustified Dissent. In Philosophy of Science: Between the Natural Sciences, the Social Sciences, and the Humanities; Christian, A., Hommen, D., Retzlaff, N., Schurz, G., Eds.; Springer International Publishing: Cham, 2018; pp 155–169. https://doi.org/10.1007/978-3-319-72577-2_9.

(8)      Quarks Science Cops Folge 1 – Die Akte Bhakdi – quarks.de. https://www.quarks.de/podcast/science-cops-folge-1-die-akte-bhakdi/ (accessed 2023-08-03).

(9)      Why People Cherry-Pick Science Data – It’s Happening With Coronavirus. https://www.forbes.com/sites/marshallshepherd/2020/04/18/why-people-cherry-pick-science-dataits-happening-with-coronavirus/?sh=3fb0dc9a1a21 (accessed 2023-08-03).

(10)    Lewandowsky, S.; Gignac, G. E.; Oberauer, K. The Robust Relationship Between Conspiracism and Denial of (Climate) Science. Psychol. Sci. 2015, 26 (5), 667–670. https://doi.org/10.1177/0956797614568432.

(11)    Do scientists agree on climate change? – Climate Change: Vital Signs of the Planet. https://climate.nasa.gov/faq/17/do-scientists-agree-on-climate-change/ (accessed 2023-08-03).

(12)    The Coronavirus Outbreak Is a Petri Dish for Conspiracy Theories | WIRED. https://www.wired.com/story/coronavirus-conspiracy-theories/ (accessed 2023-08-03).

(13)    Del Vicario, M.; Scala, A.; Caldarelli, G.; Stanley, H. E.; Quattrociocchi, W. Modeling Confirmation Bias and Polarization. Sci. Rep. 2017, 7 (January), 1–9. https://doi.org/10.1038/srep40391.
Picture credits: Unsplash