
Facebook sued for $150 billion over violence against Rohingya in Myanmar

By Michelle Toh, CNN Business

Rohingya refugees are suing Facebook over its admitted failure to stop the spread of hate speech that contributed to violence against the group in Myanmar.

This week, law firms in the United States and the United Kingdom launched a legal campaign against Meta, Facebook’s parent company, alleging in a statement that executives knew of anti-Rohingya posts, groups and accounts on the social network and did little to curb them.

According to a website set up for the campaign, the UK legal claim will be on behalf of those who live anywhere outside the United States, while the American claim will represent those residing stateside.

Altogether, the attorneys represent “Rohingya people around the world, including those living in refugee camps in Bangladesh,” the website states.

US law firm Edelson said on Twitter it had filed a proposed class action lawsuit against Meta in California. A copy of the complaint reviewed by CNN Business shows that plaintiffs are seeking more than $150 billion in compensatory damages, in addition to punitive damages to be determined in court.

In a letter addressed to Facebook’s London office on Monday, McCue Jury & Partners said that it had coordinated with partners in the United States to kick off a “trans-Atlantic legal campaign to seek justice for the Rohingya people.”

“Our clients intend to bring proceedings against [Facebook] UK in the High Court for its acts and omissions that contributed to the serious, sometimes fatal, harm suffered by our clients and their family members,” the law firm wrote in the letter, which was posted on the campaign website.

“Claimants in both cases will seek to remain anonymous for fear of reprisal,” Mishcon de Reya, one of the British law firms handling the UK complaint, said.

This week’s legal claims accuse Facebook of using algorithms “that amplified hate speech against the Rohingya people on its platform,” as well as failing “to take down specific posts inciting violence against or containing hate speech directed towards the Rohingya people,” Mishcon de Reya wrote in a statement.

Facebook also allegedly “failed to close specific accounts or delete specific groups or pages, which were being used to propagate hate speech and/or incite violence,” the statement said.

Meta declined to comment on Tuesday.

The US suit, which was the first to be filed, would have to clear numerous hurdles just to make it to summary judgment or trial, let alone secure a favorable ruling, according to Josh Davis, a professor at the University of San Francisco School of Law with expertise in class action lawsuits and complex litigation.

For a suit to be certified as a class action by a judge, the plaintiffs’ claims must predominantly raise “common” issues. But given the nature of the crisis in Myanmar, the experiences of potential class members could vary widely, and “it’s hard to imagine proof that would be common to the class that would establish that Facebook’s conduct harmed individual class members,” Davis said.

The legal argument in the US case may also be tricky. It alleges that Facebook should face product liability and negligence claims for failing to address defects in its platform that plaintiffs say contributed to anti-Rohingya violence, court documents show. In the United States, Facebook would typically be shielded from such liability by Section 230 of the Communications Decency Act, but the suit asks the court to instead apply Burmese law, which it says does not provide such protections.

Davis said American courts are typically reluctant to take on such cases. He added that proving Facebook’s actions caused the harms to the Rohingya people may be difficult.

“From a legal perspective, it’s going to be [a] really challenging [case] to bring,” Davis said.

The Rohingya are a stateless Muslim minority in Myanmar’s Rakhine State, thought to number about 1 million people. Myanmar does not count them as citizens, nor as belonging to one of the recognized ethnic groups in the country.

In 2016 and 2017, the military launched a brutal campaign of killing and arson that forced more than 740,000 Rohingya to flee into neighboring Bangladesh, prompting a genocide case that was heard at the International Court of Justice.

In 2019, the United Nations said “grave human rights abuses” by the military were continuing in the ethnic states of Rakhine, Chin, Shan, Kachin and Karen. Survivors have recounted harrowing atrocities including gang rape, mass killings, torture and widespread destruction of property at the hands of the army.

A UN fact-finding commission has called the violence a “textbook example of ethnic cleansing.” In 2018, the US House of Representatives made the same declaration, and this year US President Joe Biden’s administration has been reviewing whether to make that designation.

The US complaint mentions allegations made by Frances Haugen, the former Facebook employee who recently came forward as a whistleblower on the company’s practices.

Haugen has said that “Facebook executives were fully aware that posts ordering hits by the Myanmar government on the minority Muslim Rohingya were spreading wildly on Facebook,” and that “the issue of the Rohingya being targeted on Facebook was well known inside the company for years,” according to the suit.

Facebook CEO Mark Zuckerberg issued a 1,300-word statement in response to Haugen’s claims, which extended well beyond Myanmar. In it, he said that a “false picture of the company” was being painted.

“If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space — even ones larger than us?” he wrote at the time.

But Myanmar has become a case study in the deadly impact that hate speech shared on Facebook can have.

In 2018, a senior UN official addressed the Myanmar crisis, saying it bore “the hallmarks of genocide.” The official said that, by promoting violence and hatred against the Rohingya population, Facebook had “turned into a beast.”

The company later acknowledged that it hadn’t done enough to prevent its platform from being used to fuel bloodshed, and Zuckerberg apologized after an open letter from activists and promised to increase moderation efforts.

Still, Facebook’s previous admissions won’t necessarily bolster the arguments made in these new lawsuits. “To say that they should have done more doesn’t mean that they violated anyone’s legal rights or that anyone can establish that what Facebook did caused their injury,” Davis said.

Now, lawyers handling the complaints say that Facebook has also “failed in its policy and in practice to invest sufficiently in content moderators who spoke Burmese or Rohingya or local fact checkers.”

Facebook executives said in October that the company had “hired more people with language, country and topic expertise” in countries like Myanmar over the last two years and had added content moderators in 12 new languages this year.

“Adding more language expertise has been a key focus area for us,” they wrote in an October blog post.

Facebook’s issues with foreign languages extend to other volatile countries, such as Ethiopia and Afghanistan.

— Clare Duffy, Helen Regan, Eliza Mackintosh and Rishi Iyengar contributed to this report.

