Coronavirus shows our health agencies are ill prepared for fake news
Disinformation spreads even faster than disease—and it can be just as deadly.
A Harvard-affiliated epidemiologist. The president. Untold (and mostly anonymous) people online. The outbreak of the 2019 novel coronavirus has been clouded by false information from these sources and more, ranging from stretched half-truths to outright fabrications. In a still-evolving public health situation like this one, such misinformation stokes panic and makes it harder to quell the spread of disease. But the Centers for Disease Control and Prevention (CDC) is ill-prepared to combat digital disinformation.
“We really count on all of you to try to set the record straight,” Nancy Messonnier, director of the CDC’s National Center for Immunization and Respiratory Diseases, told reporters when asked about false information during a briefing last week. “We recognize that misinformation can rapidly spread, especially through social media, and I’m asking for your help to combat the spread of those rumors.”
Although the government body has a lengthy FAQ on its website covering what is currently known about 2019-nCoV, it does not specifically address false information, which is often called mis- or disinformation. The CDC did not respond to Popular Science’s request for further comment on the matter.
According to Nicholas Evans, a philosophy professor at the University of Massachusetts Lowell who studies false information and biosecurity, the CDC’s approach is to first release “very clear guidance on what they consider to be the best evidence-based policy for responding to the disease.”
That’s probably not going to help combat false information about a disease, he says. As the persistence of the anti-vaxx movement demonstrates, simply presenting scientifically correct information isn’t enough to dispel an active disinformation campaign.
The other thing the CDC is currently doing, Evans says, is working to address specific false news stories when asked about them by journalists and public health authorities. Evans says that isn’t enough, either.
“I think that everyone is in a really heightened state at the moment, and the CDC or an authority like that could play a role in countering disinformation,” he says. At the very least, he adds, they should probably put up a webpage that identifies and debunks some of the most common myths about 2019-nCoV.
But even that wouldn’t necessarily solve the problem. “The resources that someone like the CDC would need to stage an active campaign of countering disinformation, unfortunately, are much much larger than the resources you need to spread that disinformation,” Evans says. “Bullshit tends to move faster than the truth and require fewer resources.”
Bad info also gets results for people seeking greater prominence, money, or even just chaos. “For a lot of people, the incentives come in part from attention,” says Evans. Take Eric Ding, a Harvard-affiliated nutritional epidemiologist whose inaccurate tweets have stoked panic about the virus for well over a week now. He started by tweeting about a now-revised preprint (a paper that had not yet been peer reviewed) that seemed to show that 2019-nCoV was highly infectious. A now-deleted tweet described it as “thermonuclear pandemic level bad,” and Ding predicted “possibly an unchecked pandemic.” He gained tens of thousands of followers and was interviewed by numerous media outlets—despite the fact that his epidemiological expertise has nothing to do with infectious diseases. “His follower count went through the roof,” says Kent State University infectious disease epidemiologist Tara Smith, one of many scientists who have continued to call Ding out on Twitter. “I think it’s made things a lot more confusing for a lot of people.”
Because science is a specialized discipline, people who want to spread false information can twist its findings to their advantage, says Aditya Ranganathan, a public education specialist at UC Berkeley and member of the leadership committee at Public Editor. Consider, for example, conspiracy theorist Alex Jones’s false claim that 2019-nCoV was made in a lab, which relies on a flawed analysis that its own authors have retracted. Science has plenty of tools, like peer review, to ensure that false information gets corrected—but those corrections don’t always reach the public. That’s why Ranganathan believes it’s important for people to learn to think critically and evaluate information the same way scientists are taught to. (Check out our guide to fact-checking dubious science headlines here.)
Jones’s media outlet InfoWars makes millions of dollars selling supplements and supplies for “preppers,” people preparing for the apocalypse. Outbreaks represent a big opportunity to stoke existential fears. And because most laypeople don’t know much about how scientific studies are evaluated and published, it’s easy for someone with a large audience to cherry-pick unvetted, incomplete, or entirely unfounded research and present it as “evidence” to the masses.
“Health is really where science hits the everyday person on a daily basis,” Ranganathan says, making outbreaks a particularly dangerous scenario. But governments have a long way to go before they’re prepared to combat false information online about health, he says.
The World Health Organization includes combating misinformation in the checklist it provides for governments developing their own pandemic response plans. But at this point, the CDC’s public-facing efforts to counter false information are minimal.
Both the current situation and past outbreaks make it clear that’s a problem. False information about Ebola helped the disease spread across the Democratic Republic of the Congo—and may even have changed the course of the DRC’s election. In the current context, fears about the coronavirus have fueled racism and shaped government policy—including the U.S. travel ban, which runs contrary to what the World Health Organization recommends. We don’t know yet what will happen with 2019-nCoV, but the spread of all this misinformation isn’t helping. When the next pandemic comes, whether that’s today or in a decade, false information is likely to spread even faster than the disease.