
This story was originally featured on Nexus Media, a syndicated newswire covering climate, energy, policy, art, and culture.

Rumors take hold after every crisis, whether it’s a global pandemic or a climate-driven disaster. Social media makes it easy. Anyone can post any story, true or not, and count on others to share it, particularly if it inspires anxiety, fear or anger.

While rumors can fuel stress, they are not meant to hurt other people, experts say. Rumors are simply a way for people to try to make sense of scary events, especially when they don’t have access to reliable sources of information.

“With rumors, it’s just people trying to figure something out. They are nervous. It’s just people sharing something they’ve heard, thinking it might help someone else,” says Kate Starbird, co-founder of the University of Washington Center for an Informed Public.

Disinformation, on the other hand, is something else entirely, and it can be especially pernicious during a crisis.

People spread disinformation to promote a political agenda or to make a quick buck: hucksters are now pushing everything from faulty face masks meant to block the novel coronavirus to worthless cures meant to treat it. That’s the key difference between rumors and disinformation: intent.

“When there is still uncertainty about whether or not something is true, it’s a rumor,” Starbird says. “If you know it isn’t true, it’s disinformation.”

For instance, Sen. Tom Cotton (R-AR) recently implied that the virus was cooked up in a Chinese lab, despite all evidence showing that it arose naturally. Observers have noted that such disinformation may be intended to deflect attention away from the Trump administration’s bungled pandemic response and onto a longtime adversary, China.

Social media channels allow disinformation to spread. Pexels

Weather disasters are equally prone to disinformation. During Hurricane Harvey, for example, some said that authorities were checking the immigration status of people who went to shelters, a claim that may have been intended to dissuade undocumented immigrants from seeking aid.

“People who were vulnerable were panicking, getting angry and mad, and not willing to go to shelters because of the fear,” says Jun Zhuang, director of the Decision, Risk and Data Laboratory at the University at Buffalo. The Federal Emergency Management Agency quickly debunked that claim, among others, in an attempt to head off disinformation before it was widely shared.

“Social media is geared to grabbing people emotionally, to get more ‘likes,’ for example, which can heighten people’s perception of risk,” says E. Alison Holman, associate professor at the University of California, Irvine’s nursing school. “They get so alarmed, they spread it to other people.”

False reports of widespread arson persisted throughout the recent Australian wildfires, for example, providing ammunition for those who wanted to deny that climate change was a key driver. And it wasn’t the first time media reports blamed arson for wildfires.

In October 2017, when wildfires were ravaging Northern California wine country, anti-immigrant news site Breitbart circulated a report that an undocumented homeless man had started the fires.

While it was true that a homeless man had started a warming fire and was arrested for it, he was not responsible for the wildfires, according to the sheriff’s department. Sonoma County officials had to fight claims of arson as hard as they fought the flames.

“It turned into an international story that spread like—well, I won’t say it,” says Misti Wood, a spokesperson for the Sonoma County Sheriff’s department. “People were getting riled up that he had started the fires, and the information was so wrong. Just wrong.”

Officials need to share what they know, even if they don’t know everything, to maintain credibility during a disaster, says Gary Machlis, professor of environmental sustainability at Clemson University. Accurate, honest information, even when experts don’t have all the answers, can blunt rumors and disinformation, he says.

“It has to tell the truth, even if the truth is ‘I don’t know,’” he says. “Science, government policy and public health alerts need to be consistent and truthful and use trusted channels. If you do that, the public can better grasp and understand the gravity of a situation, and what to do about it.” Moreover, he says, elected officials should allow scientists to do the talking.

Sonoma County wildfires of 2017 seen from the air. California National Guard

President Trump, for instance, initially downplayed the magnitude of the coronavirus pandemic, even as it was hitting the US, then later touted the unproved benefits of an untested drug. The president has been such a font of disinformation that major news outlets are now cutting away from his press conferences.

“In a crisis like the COVID-19 pandemic, hubris is dangerous,” Machlis says. “Dim-wittedness is dangerous. Personal before public interest is dangerous. All three are dangerous, but combined in a leader, it can be catastrophic.”

Locally, experts say emergency management personnel should ramp up their use of social media in order to blunt disinformation.

“They need to do more than put out one tweet a day,” says Jeannette Sutton, associate professor of communication at the University of Kentucky’s Risk and Disaster Communication Center. “They have to remember they are competing with other news sources, and that we tend to pick and choose who we want to listen to, depending on our perspective.”

In the future, climate-driven catastrophes will become more frequent and diseases will continue to spread, making it critical to get accurate information to the public.

“These disasters are likely to be more costly and more severe,” Machlis says. “The people who are affected need—and deserve—the very best science, and it is the responsibility of the scientific community to deliver it.”