Science

Misinformation may make disease outbreaks worse, researchers say

Public health researchers in the U.K. say that the spread of misinformation during a disease outbreak may make the outbreak more severe. They hope their findings might help public health officials dealing with disease outbreaks.

More people would get sick from following bad advice, simulation shows

Customers are separated by a clear plastic panel on the table to prevent possible spread of the coronavirus in Hong Kong. A pair of researchers in the U.K. say their model suggests that the spread of false information during a disease outbreak may make that outbreak worse. (Kin Cheung/The Associated Press)

Public health researchers in the U.K. say that the spread of misinformation during a disease outbreak may make the outbreak more severe, and that reducing the amount of harmful advice circulating by even a little bit could mitigate that effect.

Julii Brainard and Paul R. Hunter at the University of East Anglia reached that conclusion using a simulation to model how the spread of information might affect the spread of three different viral diseases. Their results were first presented at a U.K. public health conference in 2018 and were recently published in the journal Revue d'Épidémiologie et de Santé Publique (Journal of Epidemiology and Public Health).

The findings could be relevant in light of how far and how fast misinformation has spread around COVID-19 in recent weeks.

The novel coronavirus was first detected in Wuhan, China, in December and since then has killed more than 2,200 people and sickened tens of thousands more, according to figures from the World Health Organization (WHO). The outbreak has led to quarantines, travel restrictions and cancellations of events around the world, prompted in part by an abundance of caution and in part by misinformation about the virus's spread.

The WHO was so concerned it set up a "myth busters" page to debunk claims that eating garlic or spraying chlorine all over your body can prevent coronavirus infection (they can't). Members of the International Fact-Checking Network, a group set up by the non-profit journalism organization Poynter, have run over 430 stories debunking claims around coronavirus since Jan. 22.

How the simulation works

Brainard explained that there are many existing models for how disease can spread, which factor in the ways people move around and have contact with each other, and how often that can lead to illness.

"What we did differently in thisthat hadn't been done before waswe had information spread, but the information could be good or bad in terms of affecting behaviour," she said.

So in the simulation, "people could take both good advice on how to avoid contracting a disease, or they might take bad advice, anything from failing to wash their hands to actively seeking out someone who is ill."

An image from the World Health Organization debunking the myth that applying sesame oil to the skin can protect people from contracting COVID-19. (World Health Organization)

Brainard and Hunter used real-world data for influenza, norovirus and monkeypox, which all have different incubation periods and recovery times. This was intended to show that the findings could be adapted to different diseases and different situations.

Brainard also explained that they treated the spread of bad information as cumulative in their model.

"So if you have good information and badinformation circulating, and if the bad information is kind of winning out, then over time you're going to drift into more and more bad habits," she said.

Misinformation led to more risks

In the first stage of the model, Brainard said, they assumed diseases spread without people changing their behaviour in any way, and then they measured how many people got sick.

In the second stage, half the people in the model were exposed to good information and half exposed to bad. The simulation was set up to replicate information-sharing patterns from a study published in the journal Science in 2018, which examined false news spread on Twitter over an 11-year period. That study found that false news reached more people than truthful information, and it reached them faster.

The result in Brainard and Hunter's model was that more people got sick because they took more risks with their health due to following bad advice: in the influenza example, 82.7 per cent of the population got sick in the second stage, versus 59.2 per cent in the first stage.

While in real life, fewer people would actually get the flu because they have prior immunity, the researchers assumed no one had prior immunity to the diseases in their model, which is more likely to be the case with a newly identified virus like COVID-19.

The model also found that reducing the amount of harmful advice circulating by just 10 per cent could reduce the effect of bad information on an outbreak.

In the third stage of the model, Brainard said, they assumed that 60 per cent of people received good information and 40 per cent received bad information, and even that small change resulted in sickness numbers dropping back to levels comparable to the first stage of the model.

The model also took into account that people must be in physical contact to spread the disease, but information about the disease can be passed along through social media and virtual contact.
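Reusing the simulate() sketch from earlier, the three stages can be roughly approximated by switching the effect of advice off, then running a 50/50 and a 60/40 split between good and bad information. The outputs are illustrative only, not the attack rates reported in the published study.

```python
# Rough analogue of the three stages, using the illustrative sketch above;
# the numbers it prints are not the researchers' published results.
stage1 = simulate(advice_matters=False)   # behaviour unaffected by advice
stage2 = simulate(bad_info_share=0.5)     # half good, half bad advice
stage3 = simulate(bad_info_share=0.4)     # 60/40 split favouring good advice

print(f"stage 1 (no advice effect): {stage1:.1%} ever infected")
print(f"stage 2 (50/50 split):      {stage2:.1%} ever infected")
print(f"stage 3 (60/40 split):      {stage3:.1%} ever infected")
```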

Brainard said she was surprised that a relatively small reduction of misinformation in the model could change outcomes.

"I thought, you know, maybe we'd need a more dramatic difference," she said. "But that is basically saying just a small leverage in one direction or the other can actually have a big impact in terms of countering the effects of [misinformation]."

Still, Brainard cautions, her work is a model under development and hasn't been tested in real-world settings, something she would like to see in the future.

"I think it's just one step toward trying to figure out how ... we can influence the narrative and influence what sources of information people are using. It's only a start," said Brainard.