YouTube to block all anti-vaccine video content
WASHINGTON, DC -- YouTube will block all anti-vaccine video content, including misinformation about vaccines for Covid-19, measles, chickenpox and other diseases, the company announced on Wednesday.
The Google-owned online video company is also banning prominent anti-vaccine activists and taking down several channels, according to Matt Halprin, YouTube's vice president of trust and safety.
The ban covers claims that any approved vaccine is dangerous or causes chronic health effects such as autism, Halprin explained.
The new policy marks a departure from the video site's historically hands-off approach. To date, YouTube has let people broadcast almost anything about vaccines.
YouTube’s new rule will have two caveats. Halprin indicated the company will allow “scientific discussion” -- videos about vaccine trials, results and failures. And YouTube will continue to permit personal testimonies, such as a parent talking about their child’s experiences getting vaccinated.
"Personal testimonials relating to vaccines will also be allowed, so long as the video doesn't violate other Community Guidelines, or the channel doesn't show a pattern of promoting vaccine hesitancy," YouTube said.
The announcement comes as the United States and other countries struggle to tackle misinformation that experts say contributes to vaccine hesitancy. The global rate of Covid-19 vaccinations has recently fallen to around 26 million doses per day. The move also comes as Covid-19 vaccines for children are expected to be approved in the coming months.
YouTube's action is potentially significant because of its impact on the misinformation ecosystem. "A lot of the vaccine misinformation you see on other platforms links to YouTube videos," said Lisa Fazio, an associate professor of psychology and human development at Vanderbilt University who has studied misinformation. "It was a major loophole in our information ecosystem that it was so easy to post blatantly false information about vaccines on YouTube and have it gain large audiences."
In an effort to evade the previous YouTube bans on Covid-19 misinformation, bad actors had pivoted to posting more general anti-vaccine content to sow confusion and distrust in inoculations more broadly, Fazio said.
Some social media platforms have been criticized for not doing enough to address vaccine misinformation. In July, the White House called on tech companies to ban the "disinformation dozen," a list of 12 people identified by the nonprofit Center for Countering Digital Hate as leading spreaders of vaccine misinformation. Multiple people on that list were among those YouTube said it took action against on Wednesday.
Facebook in August said it removed dozens of pages and groups related to the disinformation dozen.
Under YouTube's new rules, users who post vaccine misinformation will be subject to its strike policy, which allows up to three strikes for violating content posted within a 90-day period. A third strike results in permanent suspension. However, the company also says it may remove users after a single severe violation, or when a channel is dedicated to violating the policy.