Bold ideas and critical thoughts on science.

Mike Schäfer & Jing Zeng on the particularities of conspiracy theories on COVID-19, how to face them, and what role science communicators play while doing so.

Q: Are there particular features of the Corona pandemic that make it more attractive to conspiracy theories or conspiracy theorists? Or are we just much more aware of these theories because everyone is living through this at the moment?

Mike S. Schäfer

MS: In general, conspiracy theories are not a new phenomenon. If you go back in history you will find centuries-old conspiracy theories which people have used to explain various phenomena or events. Usually, these conspiracies involve elites or societal minorities who are seen as conspiring against the common interest. For example, in the Middle Ages, they were used in connection with catastrophes, droughts, situations of scarcity or pandemics. So the phenomenon of finding alternative, conspiratorial explanations for things for which other, more evidence-based and plausible explanations exist, is quite old.

What we are seeing now is a rise in conspiracy theories due to several factors. One is that over the past centuries, science, scientific evidence and the scientific epistemology more generally, i.e. the way science produces knowledge, have come to be questioned more often. This has many causes. But it is due in part also to science being more public. There is a general drive for scientists to communicate more, and with this, many more scientists and scientific institutions are more visible, and therefore more debatable. This also means that there is a higher public profile for scientific disagreements, uncertainties, or even controversies.

With regard to the Covid-19 pandemic specifically, what we have now is a situation where political decision-makers urgently need to make wide-reaching decisions. These decisions require expertise in a situation where it is not readily available. We have seen a lot of studies on pandemics and on Covid-19 come out in recent months, but many of them have not yet been properly peer-reviewed. Many of them are preprints or have been uploaded to preprint servers, which makes the discussion of these papers much more visible than before. The Drosten case in Germany is a very good example of this, where the tabloid Bild scandalized the fact that Drosten and his team had a preprint out, and that there were partly critical peer comments on that preprint [1]. Of course, that’s not unusual in science, but because the process is taking place in the open, on publicly accessible servers, it’s much more visible, and easier to manufacture a scandal or even a conspiracy, saying ‘oh, well this isn’t yet definitive knowledge, there is disagreement’.

At the same time, due to digital, social and mobile media, alternative explanations, even though they often have little or no evidence behind them, are more easily available nowadays. This is a situation that can become very fertile ground for conspiracy theories, and it has contributed to us having to deal with a lot of conspiracies right now: around wearing masks, vaccinations, and the allegedly hidden interests of some opaque elites behind these topics, as well as around the virus itself and its origins. Was it produced in a Chinese lab with a certain agenda behind it, as some conspiracy theorists argue? They don’t produce evidence for that, but it’s a popular theory at the moment. And once these theories emerge, it’s easy to spread them online, and easy for people who have an affinity for them to find them. So social media has a role in both constructing these theories and distributing them.

Meg Jing Zeng

JZ: I would add that, if you look at the history of conspiracy theories, epidemics are a particularly interesting case. Since the Middle Ages, conspiracies have played a role when people didn’t know what was going on, or what the cause of a certain disease was. Conspiracies were a way to find someone to blame. So in medieval Europe, people believed that witches conspired with the devil to cause epidemics, or they blamed minority groups, such as Jews. Now, while we are in a pandemic, we need to understand conspiracy theorising or rumour mongering as a very natural sense-making practice. When people are scared or faced with uncertainty, they tend to collectively make sense of their reality, and rumour-mongering and conspiracy theorising are a mechanism for doing so. And we are in a pandemic, which means this collective sense-making practice is on an international scale, with rumour and conspiracy theory inputs from all around the world. The scale of this health crisis also contributes to the scale and diversity of conspiracy theories we see today.

Q: As researchers and critical thinkers, how do we tread the line between questioning orthodoxies, and being able to judge what is just ludicrous, without slipping into conspiracy theory ways of thinking?

MS: There are plenty of examples of procedures for the critical evaluation of established theories or phenomena, both within and outside science. Within science, the normal procedure is what Robert Merton, the sociologist of science, called in the 1950s ‘organised scepticism’. We have to be sceptical about findings, theories and explanations, and we have to test them again and again in order to find out which of them best approximate the truth. Outside of science, there are approaches like investigative journalism or, now, fact-checking, where testing theories or explanations is an important part of how society critically evaluates all kinds of stakeholders and decision-makers, in areas like politics and the economy, as well as science. These are necessary in a functioning society.

The thing about conspiracy theories, though, is that they immunise themselves against criticism. They often don’t subscribe to the basic mechanisms with which you can actually test facts or an entire epistemology. A conspiracy theory will posit an explanation, but no matter what counter-arguments are presented, the conspiracy just immunises itself against them. If evidence is presented that contradicts the theory, that evidence must be wrong or must itself have been produced by conspirators. So whatever facts are out there that might be used to contradict what conspiracy theorists are thinking, those facts are deemed to be wrong or manufactured. They believe that there is a huge system manufacturing these fallacies, and even if there is scientific evidence supported by renowned scholars, they see it as part of the conspiracy.

Many conspiracy theories come up with evidence that is actually impossible to test, making it impossible to come up with counter-arguments. One of the hallmarks of conspiratorial thinking is that it doesn’t subscribe to the importance of evidence or the role that evidence testing plays in validating a theory. These are more beliefs than evidence-based theories.

Q: So conspiracy theories immunise themselves by removing the discourse from any formal structures, such as taking the discussion out of the mainstream media, where there are behavioural norms about how a discourse plays out, or out of the scientific context and into the realm of social media. Could you give a bit more detail about those mechanisms of immunisation?

JZ: Social psychology would describe it as the ‘self-sealing effect’ of conspiracy theories. Most conspiracy theory believers select information first by assessing whether it is compatible with their pre-existing beliefs; the credentials of the information itself come second. This is why most scientists attempting to debunk conspiracy theories are automatically perceived as being part of the conspiring institutions, and their argument loses credibility.

MS: John Cook, who works on climate change denial and conspiracy theories, has come up with a taxonomy called ‘Techniques of Science Denial’, which consists of five mechanisms that people use when they deny science. He uses the acronym FLICC to describe the use of fake experts, logical fallacies, impossible expectations, cherry picking bits of information, and conspiratorial thinking. The beauty of conspiracy theories for many of the people subscribing to them is that they provide certainty about things that are, or may seem, fundamentally uncertain. They provide certainty by putting an explanation out there for which they can mobilise a number of seemingly supportive sources of evidence. For that they cherry pick – they use the bits that fit into their theory, and they discard or deny the bits that don’t fit. And as I said, it’s very hard to falsify such conspiracy theories, because part of the basic assumption of conspiratorial thinking – already embedded in the term ‘conspiracy’ – is the idea that someone is hiding the true facts.

Social psychologists have also done research that shows that, as well as producing certainty, conspiracy theories are very attractive to people who enjoy being part of an alleged elite. They make a distinction between the few people who are in the know about what is truly happening, and all the other people – the ‘sheep’ – out there who are being tricked or misled or manipulated.

Q: So in effect, people create an elite for themselves when they feel excluded from what they perceive to be another elite?

MS: Yes, although it doesn’t always involve elites. Governments, the WHO, or Bill Gates are perceived as elite, but there are also conspiracy theories that function not with elites, but which locate the conspirators within society, among us, such as certain religious or wealthy groups. For example, people who attend the Bilderberg conference, who are not a formal elite, but are still seen as a powerful group within society steering or manipulating people.

Q: Uncertainty is something that scholars are familiar with, so do you think it’s useful for scholars to engage with conspiracy theories and theorists? Is it a worthwhile effort, and can it have an effect?

MS: This needs a differentiated answer. What tends to happen when you engage with conspiracy theories is that you end up repeating their claims, and then try to debunk them. We know that repeating claims – just putting them out there and giving them more traction – can lead to people discovering these theories, finding them attractive and sometimes subscribing to them. Take, for example, the outcry that took place when conspiracy theorists held small demonstrations about Covid-19: even though the media were saying ‘well, this is obviously untrue or outlandish’, the outcry did not necessarily help to combat conspiracy theories. The coverage gives them visibility and does not necessarily decrease followership.

When it comes to conspiracy theorists, there are degrees of belief. There are people who are deeply entrenched in conspiratorial ideas and groups. Convincing them otherwise is very, very difficult, and may not be worth attempting at all. If you want to engage in social media debates with actual conspiracy theorists – about flat-earth theory, for instance, or Covid-19-related conspiracy theories – it’s difficult. But it may still be worth doing if you do it for the audience observing the discussion, rather than to convince the entrenched conspiracy theorists themselves, even though they are the ones you are actually debating. Scientists should do this for the people who are watching the discussion, who may be on the fence, or who may have some sympathy for conspiracy theories but suddenly realise, because credible alternative views are put forward, that there are facts and other perspectives out there – this may be a way to bring them back from the brink. It is also important that these interactions take place in an arena or social media channel that is fair and open and not controlled by ‘the other side’.

There is a lot of research around the question of whether this kind of engagement is worth doing. John Cook and his colleagues are doing a lot of work on what they call inoculation – the question of whether it is possible to inform and educate people about the mechanisms that conspiracy theories use before they actually encounter a conspiracy theory. If so, then when they encounter these mechanisms, they can recognise them and be immune to them. The taxonomy mentioned earlier was developed, in part, to be used in an app, where people are presented with arguments and can then categorise them by recognising, for example, cherry-picking or the use of fake experts. This way, they become more familiar with the mechanisms that conspiracy theorists use, so that they are less prone to fall victim to them in the future.

JZ: I think there is also a very vivid example from the Corona pandemic. During the lockdown, the film Contagion trended on Netflix, and people were talking about it almost like a ‘textbook’, looking for predictions of what might happen. The worst case in point is a pseudoscientific conspiracy theory documentary called Plandemic (see the film’s Wikipedia page here – Eds), which went viral and was shared widely in various conspiracy theory groups we have been following. So when these kinds of ‘entertainment’ go mainstream, and are seen by certain audiences as a textbook for understanding what is going on, we find ourselves faced with a mass audience who are not necessarily conspiracy thinkers, but who can be manipulated by this kind of pseudo-scientific content. After this content went viral, what we saw was a lot of science practitioners and media commentators going online and using various media outlets, or their own blogs, to debunk and fact-check what was being presented in this semi-entertainment or pseudo-documentary material. This kind of engagement can be very effective, because the buzz around these media productions makes the public curious and eager to learn. If there is a critical mass of audiences out there with open ears, ready to listen, science practitioners can help them to unpack and understand what is being presented. I think that kind of engagement is very important.

Q: Does this mean that scientists and science communicators have to go where the conspiratorial thinking is happening, in order to engage with it on that level playing field? It certainly forces researchers out of the ivory tower, which is not necessarily a downside, but for some scholars that might be difficult. 

MS: It is. Going out of the ivory tower is certainly worthwhile. We not only do research on science communication, we also try to communicate, discuss and explain and see how people respond to our work. But as with all science communication, you have to be very aware of who your audience is, what your communicative aims are, and what the best means to reach those aims are. And, obviously, you have to do things differently for different audiences.

Many scientists assume that large public audiences essentially think like them – that they are interested in and open to facts, willing to listen and prone to be convinced by evidence and argumentation. But for many audiences that is just not true. Some are simply not interested in being lectured to. Others have developed very elaborate counter-theories that may not be evidence- or science-based, but are hard to break down. If you talk to these audiences, be aware of what you are doing and what you may be feeding into, as well as what your chances are of reaching your goals. You can’t go on a conspiracy theory website and say ‘Well look, I have scientific evidence, and it shows this and that, and proves that obviously what you are thinking is nonsense’. Unsurprisingly, that doesn’t work.

Q: Do you think that conspiracy theory thinking poses a long term threat to society’s faith in scientific thinking and scientific knowledge transfer? And is this getting worse, or is it that the mechanisms may be different, but the conversations haven’t actually changed that much?

MS: Hard to say. The prevalence of conspiratorial thinking is hard to assess, especially for countries like Germany. We have more evidence for countries like the US, where you can see that some conspiracy theories are pretty widespread. For example, the chemtrail conspiracy theory, which claims that chemicals are being spread all over the country by airplanes to manipulate people and keep them passive and obedient. Research shows that more than 10% of Americans think this is at least somewhat plausible. So a considerable part of the American population believes this, and many are prone to at least one or two conspiratorial ideas. We don’t know the extent to which this is the case in Germany. I don’t believe this thinking will go away, but we can all minimise it or propel it to more prominence depending on how we react. And the current reactions, both from scientists and from the media, are not always helpful in minimising conspiracy theories.

JZ: Conspiracy theories can be fast-evolving and transient. Before science practitioners can effectively engage with them, updated academic research is required to understand how conspiracy theories evolve and operate in today’s media ecosystem. The chemtrail theory is a case in point. Some ‘older’ versions of the theory claim that chemicals were sprayed to suppress people’s critical thinking abilities, or to serve the UN’s ‘depopulation’ plans. During the pandemic, the chemtrail theory has been updated and ‘upgraded’ to explain Covid-19. According to the new version, nanotechnology-empowered ‘smart dust’ was being sprayed; once you breathe it in, it gets into your lungs and is then activated by 5G. As social media and communication researchers, it is important for us to keep track of how these theories develop.

Alongside working out how to debunk conspiracy theories, I think we also need to do a lot of research to understand how conspiracy theories operate in the digital environment, and to understand the political economy of conspiracy theory media outlets. We still need to understand how their business model works. For example, how do the most prominent conspiracy theorists capitalise on their theories and profit from them? I follow lots of conspiracy theory podcasts, and I see many of their hosts as ‘conspiracy theory entrepreneurs’. During the Covid-19 health crisis, one way they were making money was by selling products, such as nano colloidal silver and ‘anti-Covid’ vitamin supplements. There are certain links in the conspiracy theory dissemination chain that can be broken, like the monetising link. If you stop conspiracy promoters from making a profit by selling these products, for example, this may demotivate them to a certain degree. But of course, that’s easier said than done. In our research, we have seen conspiracy theorists banned from mainstream money-transfer services, only to find ‘alternative’ ways.

Maybe, after all, what regulators and platforms can do to combat conspiracy theorists is not make their presence and business model impossible, but make it harder and harder.

MS: Many theories thrive on social media platforms. The ones we know, such as Facebook, Twitter and YouTube, are popular among conspiracy theorists, but they also find niches on other, lesser-known platforms like 8kun. In order to deal with these theories, you have to pressure the platforms into more robust content moderation and better recommender algorithms, which is obviously very difficult. The platforms are often transnationally run, and it is difficult for domestic regulators to get at them. And the platforms do not have an inherent interest in minimising conspiracy theories, as these create a lot of traffic and therefore a lot of revenue for them.

Also, we should not underestimate how hard it can be to actually identify conspiracy theories. Essentially, if you want to pressure social media platforms, you have to define what a fact is and what a conspiracy theory is. In some cases that is easy to do – with flat-earth theory, for example – but in others it’s not. We have to take into account that communication infrastructures have changed, and new platforms, like social media, have become intermediaries for communication.