COVID-19 is no worse than the flu, climate change is a hoax and vaccines cause autism – from time to time, scientifically well-established results are rejected by many people. Here’s the science behind why some people don’t believe in science, and how misinformation spreads.
As the world’s fight against the coronavirus continues, it is also confronted with another challenge: misinformation. Managing the COVID-19 “infodemic”, as WHO’s director-general Tedros Adhanom Ghebreyesus called it, is just as important as getting the spread of the virus under control, as downplaying the disease can lead individuals to ignore public health advice.
Misinformation and the rejection of science are not new phenomena. Human-caused global warming, the safety of vaccines and the carcinogenic effects of tobacco are just a few examples of scientifically well-established results that are rejected by a sizeable proportion of the world’s population.
What drives people to reject scientific evidence, and why do some go as far as believing in conspiracy theories instead?
What drives people to reject science?
In certain cases, science is simply rejected because the public is not properly informed about an issue. However, the situation is not always as “simple” as that. In an article published in the journal Current Directions in Psychological Science in 2016, the researchers Lewandowsky and Oberauer state that the denial or rejection of science is mostly driven by what they call “motivated cognition”. That is, people tend to reject scientific findings that threaten their worldview or core beliefs, such as religious or political convictions. By denying scientific evidence, people protect their identity and beliefs.
There are three cognitive mechanisms that drive the rejection of science, which I will explain in more detail below.

Image by hainguyenrp from Pixabay
Cognitive shortcuts
Cognitive shortcuts, also called heuristics, might be a concept you are familiar with if you have read Daniel Kahneman’s popular book “Thinking, Fast and Slow” (a great read!). Heuristics are mental shortcuts or rule-of-thumb strategies that help shorten decision-making time. In many everyday situations, heuristics help us function without constantly having to deliberate about our next course of action. However, heuristics can also lead to cognitive biases, that is, systematic errors in thinking.
When presented with science, we are often faced with novel facts, figures and graphs that we need to interpret carefully to make sense of them. In such cases, we understand the information better if our brains do not rely on quick “heuristic” information processing, but instead engage in a controlled, deliberate and analytical mode of thinking.
Most of the time, our brains do exactly that, and we interpret information and science correctly. However, this is not always the case.
A study published in 2013 found that when we are presented with data that is incongruent with our worldview, our brains tend to fall back on heuristics-based processing. In practice, this means that when people encounter data that goes against their beliefs, they do not process the information in a deliberate and analytical way. This can lead to misinterpretations and, in extreme cases, to doubting or rejecting the science.
Differential risk perceptions
The second way in which motivated cognition may express itself is through biased risk perceptions. For example, a study published in 2009 showed that when U.S. participants are presented with information about the risks and benefits of nanotechnology, liberals and conservatives assess it very differently, in line with their respective worldviews: conservatives focus on the benefits of economic development, while liberals tend to focus on environmental and health risks.
Other studies have advanced similar arguments about differential risk perceptions with regard to climate change denial.
Conspiracist cognition
The third element of motivated cognition is conspiracist thinking. According to the researchers Lewandowsky & Oberauer, “When people are motivated to reject an overwhelming scientific consensus, one way in which they may explain this consensus is via the ideation of a conspiracy among researchers.”
Conspiracy theories have skyrocketed since the beginning of the COVID-19 pandemic, which is concerning because they undermine trust in public health advice. According to a recent study by Imhoff & Lamberty, the belief that the pandemic is a hoax was associated with decreased self-reported containment behaviors, such as social distancing and handwashing. Furthermore, people who believed that the virus was intentionally created by humans were more likely to hoard food and sanitary products.

Conspiracist ideation is also known to be involved in the rejection of scientific findings on human-caused climate change. According to a study published in the journal Psychological Science, 20% of U.S. residents agree with the statement that climate change “is a hoax perpetrated by corrupt scientists who wish to spend more taxpayer money on climate research”. Furthermore, when asked about their first association with the term “global warming”, the public often responds in conspiratorial terms, naming words such as “hoax”. Finally, climate-change-contrarian blogs were found to be permeated with conspiracist themes.
In general, conspiracist thinking does not seem to be firmly linked to either side of the political spectrum. Rather, a recent study suggests that conspiracist ideation is associated with political extremism irrespective of its polarity.
How does misinformation spread?
An important factor contributing to the dissemination of misinformation in society is the media. With the growth of cable TV, the internet and radio, it has become much easier for people to find news sources that support their existing opinions and views, a phenomenon known as selective exposure. The internet and social media in particular play an increasingly important role in spreading information today. Internet users are not only passive consumers of information; they have taken on new roles and actively create content on blogs, YouTube and social media platforms such as Twitter or Facebook. If you are active on any kind of social media, you will probably know how quickly information can spread online with the help of likes, retweets and reposts. While this can be helpful in keeping people up to date and connected, it also facilitates and amplifies the spread of misinformation online.
Let’s take the COVID-19 pandemic as an example. It is the first pandemic in history during which information is communicated at scale through technology and social media. Most of us keep up to date with current developments in our region via television news outlets, online media and social media. While this provides us with many helpful news updates, we are also confronted with a lot of fake news, rumors and conspiracy theories, especially on social media. From bogus remedies such as drinking bleach to myths about vaccines being unsafe or the virus having been engineered in a Chinese lab – all of these claims (and more) have spread rapidly through the online world of social media.

Photo by camilo jimenez on Unsplash
How misinformation spreads so rapidly has been the focus of several studies. Results from a study that looked into how Facebook users consume science and conspiracy information show that selective exposure to content and cherry-picking are the main drivers behind the spreading of rumors. Users tend to select and share only specific content (such as conspiracy information) and ignore the rest (such as scientific information). Furthermore, users tend to be exposed to and share content selected by friends who have the same “polarization”, which causes reinforcement and fosters confirmation bias, segregation and polarization.
Another study looked at how verified true and false news stories spread on Twitter. The dataset comprised around 126,000 stories tweeted by roughly 3 million people more than 4.5 million times between 2006 and 2017. The researchers found that false news spread significantly farther, faster, deeper and more broadly than true news. This effect was especially pronounced for false political news compared with false news about science, natural disasters, terrorism or financial information.
Is it possible to correct misinformation?
Once misinformation has spread, it oftentimes persists for a long time. What causes misinformation to persist and why do attempts at correcting or retracting misinformation fail so often?

Photo by Vlad Tchompalov on Unsplash
Research suggests that this is a complex issue, as attempts at correcting misinformation can reinforce the misinformation itself (a so-called backfire effect). One explanation is that when scientists try to debunk misinformation, they repeat the misinformation in the process, which makes it more familiar. To avoid this, it is helpful to emphasize the correct information that is to be communicated rather than the misinformation. Furthermore, it is helpful to repeat the retraction so that the facts themselves become more familiar.
In addition, correcting misinformation can backfire when the “true” information threatens the worldview of the audience (remember, as outlined above, people tend to reject scientific findings that threaten their worldview or core beliefs). In line with this, people who are not firmly fixed in their views will be more receptive to new information. If your audience holds very strong beliefs and its worldview might be threatened by the facts you communicate, it can be helpful to present the content in a way that affirms their worldview (e.g., by focusing on potential benefits and opportunities rather than threats and risks).
Last, it is helpful to ensure that whatever material is used for debunking is simple and brief. Information that is simple and compelling tends to be more cognitively attractive and more easily remembered, so it is important to make sure the true facts offered as an alternative to the misinformation are easy to grasp.
If you would like to learn more about how to identify and debunk conspiracy theories, I would like to recommend the following free resources:
- The Conspiracy Theory Handbook by Stephan Lewandowsky and John Cook (also available in French, German, Spanish, Portuguese, Greek, Hungarian and Serbian here), which explains “why conspiracy theories are so popular, how to identify the traits of conspiratorial thinking, and what are effective response strategies”.
- The page “Identifying conspiracy theories” by the European Commission, which contains ten infographics aimed at helping citizens identify, debunk and counter conspiracy theories.
Further reading & resources
- Lewandowsky, S., & Oberauer, K. (2016). Motivated Rejection of Science. Current
Directions in Psychological Science, 25(4), 217–222. [Link] - Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz,N., & Cook, J. (2012). Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public Interest, 13(3), 106–131. [Link]
- Vosoughi,S., Roy, D., & Aral, S. (2018). The spread of true and false news
online. Science, 359(6380), 1146-1151. [Link]