Facebook Oversight Board reiterates calls for Meta to be more transparent



By Clare Duffy, CNN Business

The independent board that oversees Facebook’s parent company, Meta, is once again urging the social media giant to be more transparent about its content moderation decisions.

The Facebook Oversight Board on Thursday released two rulings overturning Meta’s decisions to remove user posts from its platforms, saying the content did not actually violate the company’s policies. In both decisions, the board recommended that Meta provide more information to users about actions it takes on their content.

The Oversight Board is made up of experts in areas such as freedom of expression and human rights who are appointed by the company but operate independently. The board, which reviews user appeals of content decisions on Meta-owned platforms, is often described as a kind of Supreme Court for Facebook.

The board has been making similar calls for transparency since its decision in May to uphold Facebook’s suspension of Donald Trump. In that ruling, the board agreed that Trump had severely violated Facebook’s policies but criticized the company for its initial indefinite suspension of the then-President. The board asked Facebook to clarify how its actual policies applied to the situation, saying that “In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities.”

“They can’t just invent new, unwritten rules when it suits them,” Oversight Board co-chair Helle Thorning-Schmidt said in May about the Trump ruling. “They have to have a transparent way of doing this.”

In October, the Oversight Board released its first quarterly transparency report, noting that “Facebook isn’t being clear with the people who use its platforms,” and calling on the company to give users more information about content decisions. The board also criticized Meta for withholding crucial details about its “Cross-Check” program to review content decisions related to high-profile users, following reporting on the program based on leaked internal documents. (In response, Meta said it had asked the board for input on Cross-Check and would “strive to be clearer in our explanations to them going forward.”)

Calls for more transparency cropped up again in the latest round of Oversight Board decisions.

Oversight Board’s Thursday rulings

In one of the two cases, the board overturned Meta’s decision, made after a review by its automated system and a human moderator, to remove an Instagram post discussing ayahuasca, a psychedelic drink that has long been used for indigenous healing and spiritual rituals. Meta told the Oversight Board that the post was removed because it encouraged the use of a non-medical drug. But the board said the post did not actually violate Instagram’s Community Guidelines at the time, which only prohibited the sale and purchase of drugs. It also took issue with the company’s claim that the post could harm public health, saying the post discussed the use of ayahuasca in a religious context and did not include information about how to get or use the substance.

But much of the board’s criticism in the case centered on the fact that Meta did not tell the user who made the post which of its rules they broke.

“The Board is concerned that the company continues to apply Facebook’s Community Standards on Instagram without transparently telling users it is doing so,” the board said in a statement. “The Board does not understand why Meta cannot immediately update the language in Instagram’s Community Guidelines to tell users this.”

The board ordered Instagram to restore the post. And in its recommendations based on the case, the board said Meta should “explain to users precisely what rule in the content policy they have violated” when their content is acted upon. It also encouraged the company to update Instagram’s Community Guidelines within 90 days so that they are consistent with Facebook’s Community Standards.

In response to the board’s decision, Facebook said it had reinstated the post and would conduct a review of similar content. It also said that it would review the board’s policy recommendations.

“We welcome the Oversight Board’s decision today on this case,” the company said in a statement.

The second case concerned an August 2021 Facebook post of a piece of North American Indigenous art meant to raise awareness about the discovery of unmarked graves at a former residential school for Indigenous children. In the post, the user noted the name of the artwork, “Kill the Indian/Save the Man,” and described images in the work as “Theft of the Innocent, Evil Posing as Saviours, Residential School / Concentration Camp, Waiting for Discovery, Bring Our Children Home.” Facebook’s automated systems flagged the post as potentially violating its hate speech policy, and a human reviewer removed it the day after it was posted; when the user appealed, a second human reviewer upheld the removal.

When the Oversight Board selected the case for review, Meta identified the post’s removal as an “enforcement error” and restored it on August 27, according to the board’s Thursday statement. However, Meta did not notify the user that their post had been restored until a month later, after the board asked about the company’s messaging to the user, an issue Meta blamed on human error, the board said.

Based on the case, the board recommended that Meta “provide users with timely and accurate notice of any company action being taken on the content their appeal relates to.” It added that “in enforcement error cases like this one, the notice to the user should acknowledge that the action was a result of the Oversight Board’s review process.”

Meta said in a statement that no action would be needed based on the board’s decision in this case because it had already reinstated the post, adding that it would review the board’s policy recommendations.

In its second transparency report, also released Thursday, the Oversight Board noted the two decisions and said it would be “monitoring whether and how the company lives up to its promises” as it responds to the board’s recommendations. It also announced plans to release an annual report next year to assess the company’s performance in implementing the board’s decisions and recommendations.

“Over time, we believe that the combined impact of our recommendations will push Meta to be more transparent and benefit users,” the board said in the report.
