Social media rules. That’s bad in a pandemic
Popular social media posts are filled with inaccuracies about science. They could damage public health during this coronavirus pandemic, the authors of two separate studies say.
One study found that more than one in four of the most popular YouTube videos about the novel coronavirus contained misinformation. Another found that vaccine skeptics were winning the battle for Facebook engagement.
More than 70% of adults turn to the internet to learn about health and healthcare, a team of researchers in Canada said. They analyzed popular YouTube videos on a single day earlier this year, filtering for those that mentioned coronavirus.
After excluding videos that weren’t in English, ran longer than an hour, or lacked audio or visual content, they wound up with 69 videos with a total of 257,804,146 views. They rated each based on factual content covering symptoms, prevention, treatments, epidemiology and viral spread.
The videos came from a variety of sources: network news, which made up the largest portion; entertainment channels; internet-based news operations; professional YouTube stars; newspapers; educational institutions; and government agencies.
Nearly 50 of the videos, or 72%, got the facts right. The roughly one in four that didn’t contained misleading or inaccurate information, Heidi Oi-Yee Li of the University of Ottawa and colleagues in Canada wrote in the online journal BMJ Global Health.
More than 62 million people had watched the most misleading YouTube videos.
Past studies looking at YouTube usage found the platform has been key in spreading vital information about how to keep people safe in a pandemic or public health emergency.
If this many videos are inaccurate, there’s a “significant potential for harm,” Li and colleagues wrote.
“YouTube is a powerful, untapped educational tool that should be better mobilized by health professionals,” they wrote. Too often, government information is static and uninteresting. Public health agencies could benefit if they were to team up with people who understand how best to communicate on YouTube, the researchers said.
In another study, researchers looked at scientific information on Facebook and found that a similarly static message from official public health leaders made those posts less impactful.
This study, published Wednesday in the journal Nature, found that people who have not made up their minds about vaccines may be more influenced by what they see on social media, and that could be a real problem during the coronavirus pandemic.
This research collaboration between scholars at George Washington University, University of Miami, Michigan State University and Los Alamos National Laboratory looked at comments from more than 100 million Facebook users in a variety of online communities that discussed vaccines during the 2019 measles outbreak. The conversations were spirited and the contributors spanned several countries and communicated in several languages.
Among Facebook users, opinions seemed to fall into three camps: people who were pro-vaccine, those who were anti-vaccine and the undecided. Even undecided social media users were still highly engaged with the topic, researchers said.
The researchers looked to see how individuals from one group interacted with the others, and created a map to track these conversations.
They found that while people who did not believe in vaccines were fewer in number, there were nearly three times as many anti-vaccination groups on Facebook as pro-vaccination groups. That larger number of groups, in part, allowed those communities to become more entangled with the undecided communities, which swayed some opinions. The anti-vaccine communicators tended to have a variety of messages, and that variety let them join more conversations.
“Even though they are numerically small, they appear big online because they have so many flavors of arguments and narratives,” said study co-author Neil Johnson. Some messages focused on claims that vaccines cause health problems. Other messages emphasized free choice. Others spun conspiracy theories.
Pro-vaccine posters, like members of state public health departments, tended to concentrate their communication efforts on one message: vaccines protect public health. Having just one message cost them the opportunity to communicate with some of the medium-sized groups that weren’t as visible as others.
Johnson, a professor of physics at the George Washington University who heads its initiative in Complexity and Data Science, said the team was about to wrap up the study when the Covid-19 pandemic hit. They have continued to monitor these groups and found that the distrust of the establishment and of science had carried over to the pandemic.
“It’s just morphed. It’s almost like it’s become the perfect storm for Covid, this kind of online behavior that distrusts science, and because they’re already organized and embedded in groups like the local pet lovers association,” Johnson said. “Previously they may have had a hard job talking with the local pet lovers association about vaccines, but now everybody’s talking about Covid and possible vaccines, so it’s kind of their moment.”
Johnson said we tend to trust people in our own communities. So when your Uncle Arthur tells you something about your dog that is absolutely correct, something the vet hasn’t mentioned yet, you start to trust your Uncle Arthur more.
“We trust people in our communities because of this kind of interaction,” Johnson said. “Then when they turn around and tell me something about how Bill Gates is behind a particular vaccine, and you better watch out because he’s going to inject you with something, you might actually give it some kind of credence.”
The people who are spreading the anti-vaccine message online are not “crazies” or “flat Earth” people; instead, they are people who are sort of “grabbing somewhere,” putting two and two together and “just getting the wrong answer,” Johnson said. But then “everyone around them thinks they’ve got the right answer.”
Johnson said he was “very skeptical” when he started the study.
He thought the online conversations would look like a battle between government and establishment science, with its health recommendations in the middle, and small, disorganized communities trying to pick away at it. But that’s not what he found.
“It’s more like the anti-vaxxers are embedded with the local pet club and with the parent teachers and, you know, the establishment science and public health experts, it’s almost like they were sitting in an entirely different battlefield,” Johnson said. “And to them it looks like they’ve won, but they haven’t, because it’s just them on that battlefield.”
Johnson said he is already seeing people in these groups saying that they won’t get a Covid-19 vaccine and will instead rely on others to be vaccinated so they will be safe. He hopes this study will help public health officials think through new communication strategies to reach more people with their message.