Post-truth in Science: When Emotions Obscure the Facts

From vaccine rejection to climate change denial, this reflection explores the causes of scientific misinformation and offers strategies to combat it.
"Post-truth does not mean 'truth is dead,' but that it now competes with emotionally compelling yet false narratives." (Photo: Courtesy. Illustration: TecScience)

By Silverio García-Lara

In 2016, Oxford Dictionaries named post-truth its word of the year, defining it as a situation in which “objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.”

Applied to science, this concept describes a growing problem in which scientific evidence is displaced by narratives that, although unfounded, go viral because they resonate with prejudices, emotions, beliefs, or particular interests.

Scientific post-truth not only distorts public debate and perception but also threatens global health, the economy, and democracy—ultimately endangering human survival. Recent examples such as the anti-vaccine movement, resistance to climate policies, or the banning of GMOs show how misinformation can have lethal consequences.
How did we get here? And what can we do to reverse it?

Identifying Post-truth

We must learn to recognize it. Post-truth in science is evident when scientific consensus is ignored in favor of fringe or discredited theories (e.g., flat-Earth beliefs).

It can also be seen when data is manipulated or selectively presented to support a narrative (climate change denial). Moreover, it manifests when emotions and group identity outweigh evidence (e.g., rejection of GMOs based on ideology, not data).

Mechanisms of Spread

Post-truth in science thrives through certain mechanisms. First, confirmation bias, the tendency to accept only information that aligns with preexisting beliefs.

Second, social media algorithms, which prioritize polarizing content—even if false—because it generates more engagement and “likes”.

Third, distrust in scientific institutions, especially when science is perceived by society as elitist, manipulative, or fraudulent, leading to growing adherence to pseudoscientific alternatives.

Critical Examples and Consequences

Despite vaccines saving millions of lives annually, baseless theories—such as their supposed link to autism—have reduced global immunization coverage. A fraudulent 1998 study by Andrew Wakefield fueled this myth; the paper was later retracted by The Lancet and thoroughly discredited.

In 2019, the WHO listed vaccine hesitancy, driven largely by misinformation, among the top ten threats to global health. A 2021 study in Nature Reviews Immunology estimated that such rumors delayed COVID-19 vaccination efforts, prolonging the pandemic in several countries.

In the case of climate change, 99% of scientific studies confirm it is human-caused. However, interest groups such as oil companies and governments promote doubt to delay environmental policies, using tactics like outright denial (“The climate has always changed”) or spotlighting dissenting scientists to create a false impression of legitimate debate.

Over 2,800 studies and reviews confirm that approved GMOs are as safe as conventional crops. Despite this scientific consensus and demonstrated benefits, myths persist, generating public rejection and restrictive policies.

This phenomenon illustrates how ideological interests can distort public perception, leading to policy outcomes such as the GMO bans enacted in Mexico.

The consequences are detrimental to public health and the global economy. On one hand, we see the resurgence of diseases once close to eradication, such as polio, and new outbreaks of preventable illnesses, such as measles. On the other hand, failure to address climate change is projected to cost up to 10% of global GDP by 2100.

Additionally, public policies based on misinformation are leading to budget cuts in science under denialist governments and bans on technologies like GMOs.

This phenomenon fuels polarization, turning science into a “matter of opinion” rather than an objective, rational consensus. In the case of GMOs, one tangible consequence is reduced food quality and quantity wherever the technology is unavailable.

Can We Fight Back?

Yes. First, we must improve science communication. With greater transparency, it is possible to explain uncertainties without resorting to false balance. Using emotionally positive narratives can help connect science with values—for example, “Vaccines protect your children.”

Second, we must promote scientific rigor, education, and critical thinking. There is an urgent need for stricter peer review to prevent the publication of fraudulent studies. We must encourage scientific and digital literacy so people can identify bots and deepfakes, and teach the public to evaluate sources critically (Who is saying this? Is there reproducible evidence?).

Third, we must demand regulation and accountability from social media platforms. While Facebook and X (formerly Twitter) have begun labeling rumors and misinformation, more rigorous and large-scale efforts are still needed. Media outlets must avoid scientific “clickbait” (exaggerated or misleading headlines), which often confuses the public rather than informing it.

In most scientific controversies, the solution is not just more data, but communicating science with empathy, dismantling myths without belittling legitimate fears (such as concerns over corporate control of technology or scientific fraud). The antidote is clear: more education, better communication, and zero tolerance for lies disguised as opinion.

Post-truth does not mean “truth is dead,” but that it now competes with emotionally compelling yet false narratives. This is not an abstract issue—it is real because it costs money, weakens societies, and threatens human survival. In the face of this, scientists, journalists, and citizens must form an alliance to defend the facts. Combating post-truth depends on an informed citizenry and institutions that value and uphold the truth.

As Carl Sagan wrote: “We live in a society exquisitely dependent on science and technology, in which hardly anyone knows anything about science and technology. This is a prescription for disaster.”

Recommended References

  • O’Connor, C., & Weatherall, J. (2019). The Misinformation Age. Yale University Press.
  • World Health Organization (2021). Report on Health Misinformation.
  • Mesnage, R., Agapito-Tenfen, S., Vilperte, V., et al. (2016). An integrated multi-omics analysis of the NK603 Roundup-tolerant GM maize reveals metabolism disturbances caused by the transformation process. Scientific Reports, 6, 37855. https://doi.org/10.1038/srep37855
  • Wakefield, A. J., Murch, S. H., Anthony, A., et al. (1998). Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. The Lancet. (RETRACTED)
  • Tregoning, J. S., Flight, K. E., Higham, S. L., et al. (2021). Advances in COVID-19 vaccination: viruses, vaccines, and variants versus efficacy, effectiveness, and escape. Nature Reviews Immunology, 21, 626–636. https://doi.org/10.1038/s41577-021-00592-1
  • Lockwood, G. (2016). Academic clickbait: Articles with positively-framed titles, interesting phrasing, and no wordplay get more attention online. Authorea, June 29. https://doi.org/10.15200/winn.146723.36330

Author’s Note

This text was prepared with support from the artificial intelligence tool DeepSeek (2025) for structuring, concept review, and editing.

Author

Silverio García-Lara is a research professor in the Department of Bioengineering at the School of Engineering and Sciences, Tecnológico de Monterrey. He is a Level 3 member of the National System of Researchers.
