Coronavirus: How does misinformation spread, and how can we stop it?

Coronavirus misinformation is massively harmful. How, then, do we combat such claims?

A man wears an anti-vaccine button as people and teachers protest against New York City mandated vaccines (photo credit: REUTERS/MIKE SEGAR)

For those who wish to combat COVID-related misinformation - such as the belief that vaccines contain microchips, or that the coronavirus vaccine causes infertility - it often feels like an uphill battle.

The issue often emerges around unproven treatments for coronavirus.

"Efforts to rapidly develop therapeutic interventions should never occur at the expense of the ethical and scientific standards that are at the heart of responsible clinical research and innovation," says Laertis Ikonomou, PhD, associate professor of oral biology in the University at Buffalo School of Dental Medicine, who was lead author on a paper discussing the misinformation and malpractice surrounding stem cell treatment for COVID-19. 

In the peer-reviewed study, the researchers explain that despite no evidence backing such treatment, clinics have begun to pop up and offer stem cell therapies that "promise to prevent COVID-19 by strengthening the immune system or improving overall health," according to the University at Buffalo.

"Scientists, regulators and policymakers must guard against the proliferation of poorly designed, underpowered and duplicative studies that are launched with undue haste because of the pandemic, but are unlikely to provide convincing, clinically meaningful safety and efficacy data,” says co-author Leigh Turner, PhD, professor of health, society and behavior at the University of California, Irvine.

Anti-vaccine protestors hold placards during a march against coronavirus disease (COVID-19) vaccinations on the Sea Point promenade in Cape Town, South Africa (credit: REUTERS/MIKE HUTCHINGS)

The researchers note that social media makes it far easier to spread falsehoods and to promote incomplete or under-investigated studies. Amid the general panic surrounding the pandemic, clinics claiming to provide legitimate treatment often use such studies to "exploit" patients' fears.

Stem cell therapies, according to the study, have in the past led to patients suffering blindness and even death. The treatments are also extremely expensive, leaving vulnerable patients financially harmed as well.

Patients who are told that the treatment they have received is sufficient to prevent coronavirus may also choose not to get vaccinated, stop wearing masks or ignore social distancing guidelines, putting themselves at even greater risk.

"The search for cell-based COVID-19 treatments has ... been fraught with hyperbolic claims; flouting of crucial regulatory, scientific, and ethical norms; and distorted communication of research findings," the study says. "Rushed development and premature commercialization of cell- and gene-based therapeutics for COVID-19 and other respiratory virus infections and hyped communication of related clinical and research findings will inevitably harm the field of regenerative medicine, increase risks to patients, and erode the public's trust. Evidence-based approaches to developing safe and efficacious cell-based interventions and other medical products remain crucial even amid the challenges and intense pressures of the pandemic."

So coronavirus misinformation is massively harmful. How, then, do we combat such claims?

Well, according to a January 2021 study, trust in science and scientists is closely tied to how susceptible people are to coronavirus-related misinformation: the greater a person's confidence in the scientific method, the more likely they are to reject false claims about COVID-19.

According to a report by the London-based Center for Countering Digital Hate, 65% of such falsehoods stem from “12 anti-vaxxers who play leading roles in spreading digital misinformation about COVID vaccines.” These people have large numbers of followers, produce high volumes of anti-vaccine content or have seen rapid growth of their social media accounts over the course of the coronavirus crisis, the report said.

But perhaps more faith in the scientific process could lead to reduced belief in their claims.

A new study by researchers at Indiana University finds that even brief exposure to infographics on the scientific process may strengthen people's trust in science and thereby reduce the impact of coronavirus-related misinformation.

The research, published in the peer-reviewed Journal of Medical Internet Research, finds that as little as one minute of exposure to information about how scientists evaluate evidence may significantly strengthen a person's trust in the scientific method.

"There was also some evidence that the increase in trust also reduced beliefs in COVID-19 misinformation, through what is called a mediation effect," according to Indiana University.

Over 1,000 adults, comprising a "nationally representative US sample by age, sex and race," were randomly assigned to view either an "intervention infographic" about the scientific process or a "control infographic."

The researchers sought to evaluate whether exposure to essential aspects of science - for example, why new evidence may cause scientists to revise their conclusions - can help prevent the spread of misinformation.

"Our approach, if replicable, would potentially avoid those concerns by focusing more generally on trust and the scientific enterprise," said Jon Agley, associate professor at the IU School of Public Health-Bloomington, the lead author of the study. "The implication is that messaging to reduce the influence of misinformation may not need to address individual pieces of misinformation but could instead provide more general resistance to the influence of misinformation by speaking to misperceptions of science and scientists that might otherwise reduce trust."

For a good list of methods to fact-check coronavirus claims, check out Brian Blum's "11 ways to differentiate real, fake news about pandemic."