How health officials and social media are teaming up to fight the coronavirus ‘infodemic’

As health officials in a growing number of countries fight to slow the spread of the novel coronavirus, they’re also working to stem a secondary issue that the World Health Organization is calling an “infodemic.”

The WHO defines an infodemic as “an overabundance of information — some accurate and some not — that makes it hard for people to find trustworthy sources and reliable guidance when they need it.” The problem is aided by the ease and speed with which false or misleading information can spread on social media.

The novel coronavirus, which causes the disease COVID-19, emerged in China in late 2019 and has now spread to more than 85,000 cases worldwide, with infections on every continent except Antarctica. As the disease has spread, so too have false claims online about how it began, the number of people infected and promises of magical cures.

“In this particular case, with COVID-19, because of the growth of social media platforms in recent years, information is spreading faster than the virus itself,” Aleksandra Kuzmanovic, social media manager for the WHO, told CNN’s Brian Stelter on “Reliable Sources” Sunday.

In an effort to help people sort through the sometimes overwhelming amount of information online, Kuzmanovic said the organization is working directly with social media companies to ensure users are directed to trusted sources. Now, when users on a number of platforms, including Facebook, Twitter and Instagram, search for "coronavirus," they are directed first to information from the WHO, the US Centers for Disease Control and Prevention (CDC) or their national health ministry.

The WHO is also working to produce information in a range of languages as the outbreak spreads around the world.

But as digital misinformation campaigns become increasingly sophisticated, the WHO and other world health officials should be doing more, said Seema Yasmin, director of the Stanford Health Communication Initiative and a former officer with the CDC's Epidemic Intelligence Service.

“We’ve seen the spread of rumors and anti-science messages during Ebola, during Zika,” Yasmin said. “The anti-vaccine movement is not new, and WHO’s response often has been, ‘Oh, there’s a really bad outbreak of measles in Eastern Europe, it’s okay, we’re going to disseminate pamphlets.’ That’s not enough when the anti-science messages are sophisticated, targeting vulnerable populations and really tailoring anti-science messages to groups that believe them.”

Yasmin urged news organizations to put resources toward doing health and science journalism and encouraged health officials to be proactive in fighting the ongoing issue of health misinformation online.

Some social media platforms have independently taken further steps to curb misinformation and panic surrounding coronavirus.

Facebook said several weeks ago that it would remove content promoting bogus cures or other false claims about the coronavirus, as well as posts that could create confusion about where accurate information can be found.

The company will “remove content with false claims or conspiracy theories that have been flagged by leading global health organizations and local health authorities that could cause harm to people who believe them,” Kang-Xing Jin, Facebook’s head of health, said in a blog post published the same day the World Health Organization declared coronavirus a public health emergency.

Last week, a Facebook spokesperson told CNN Business that the platform is working with its fact-checking partners to debunk false claims about the virus. Once Facebook posts and links are fact-checked and found to be false, the spokesperson said, the platform “dramatically” cuts their distribution. People who see this content, try to share it, or already have, are alerted that it’s false.

The company also says it will prohibit ads related to the coronavirus that are misleading or aimed at profiting from the panic surrounding the outbreak.

“For example, ads for face masks that imply they are the only ones still available or claim that they are guaranteed to prevent the virus from spreading will not be allowed to run on our platforms,” the company said in a blog post.

Those efforts point to a shift in Facebook’s response to false information on its platform in recent years, according to Steven Levy, editor-at-large for Wired magazine and author of the new book, “Facebook: The Inside Story.”

“In 2015, some people tried really actively to get them to take down the (anti-vaccination) stuff and it did not resonate with them,” Levy said. “They weren’t dealing at all with the concept of what we now call ‘fake news.’ But now they’re much more sensitive.”

However, Levy pointed out that the company’s reaction is still often dependent on public opinion.

"What seems to happen is that when there's enough of an outcry, when Facebook's practices are exposed, you know, people say, 'Wow, this can't happen,' Facebook will step in and say, 'I guess we have to make an exception for this.'"

Facebook declined to comment for this story.
