Musk’s X is ‘go-to platform’ for antisemitism, study finds
By Hadas Gold, CNN
(CNN) — Elon Musk’s X is the “go-to platform for antisemitic posters,” according to a new year-long study shared exclusively with CNN.
The research, by the Center for Countering Digital Hate (CCDH) and the Jewish Council for Public Affairs, found that not only is antisemitism rampant on the platform, but also that the crowdsourced community notes are failing to effectively moderate posts — with just over 1% of the most-viewed antisemitic posts the group studied getting a community note fact-check.
Accounts with many followers that post antisemitic content, which the study dubs antisemitic influencers, are also gaining traction on X and reaching millions, the research found. Some of the most popular antisemitic influencers identified by the study paid for verification badges and are permitted to sell subscriptions to their accounts on X, meaning they can profit from hate.
“The truth is that antisemitism has been present on X long before it was rebranded (from Twitter). It always had a problem with it,” CCDH CEO Imran Ahmed told CNN in an interview. “But to see it being tolerated, monetized and amplified so openly is still shocking.”
With the help of OpenAI’s GPT-4o model, the study identified over 679,000 posts containing antisemitic remarks between February 1, 2024, and January 31, 2025. (Researchers personally checked 5,000 of the posts and said the GPT-4o model had a high accuracy rate.) The research found that 59% of those posts espoused conspiracy theories about Jews, such as claims that Jewish people control governments, that Jews are satanic in nature, or that the Holocaust never occurred, as well as misrepresentations of what happened during the Holocaust. The other 41% of the posts spouted antisemitic abuse, such as dehumanization of Jewish people or attacks on their character.
X’s policies prohibit attacking “other people on the basis of race, ethnicity, national origin, caste, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious disease” on the platform. It also prohibits “inciting behavior” intended to “degrade or reinforce” stereotypes, including dehumanizing a group of people or using slurs. X also explicitly prohibits the denial of mass casualty events such as the Holocaust.
If a post violates these policies, X says it will restrict the content’s reach, sometimes require the account to remove the content before it can post again, or delete the content itself.
At the same time, Musk has allowed conspiracy theorists and antisemitic posters back on the platform on free speech grounds. For example, white supremacist and Holocaust denier Nick Fuentes’ account was reinstated last year after being suspended in 2023.
“It is better to have anti whatever out in the open to be rebutted than grow simmering in the darkness,” Musk posted on X after Fuentes’ account was reinstated.
The new year-long study found that X’s content restrictions are not having much effect on antisemitic content. The more than 679,000 antisemitic posts identified were viewed 193 million times in total, the study said.
X did not respond to CNN’s request for comment.
Barely existent community notes
In January 2024, after coming under fire for reposting an antisemitic conspiracy theory on X, Musk visited Nazi concentration camps in Europe. During an appearance there alongside right-wing podcaster Ben Shapiro, he said that X always favors “free speech” and that if “somebody says something that is false,” users can reply with a correction.
Musk sang the praises of the platform’s community notes feature. Community notes are user-generated context labels; if enough verified users vote that a context note is valuable, it will appear under a post.
“We’ve put maximum resources and attention behind community notes, so if somebody tries to push a falsehood like Holocaust denial or something like that, they can immediately be corrected. And you can’t get rid of the tag, it’s like stuck on you,” Musk said at the time.
But the new study found that community notes are failing miserably. The notes appeared on just over 1% of the most-viewed antisemitic posts the study identified, including just two of the most-viewed Holocaust denial posts. And the notes were often added so late that, on average, they were visible for just 22% of a post’s views.
The study found that X took action, or forced the poster to take action, on only 36 of the 300 most-viewed antisemitic posts, such as limiting a post’s reach or removing it.
“Antisemitic conspiracy theories and hate that were once fringe have wholly normalized – thriving in plain sight and amplified by X’s failure to live up to its own policies,” said Amy Spitalnick, CEO of the Jewish Council for Public Affairs.
Antisemitic influencers
Ten “antisemitic influencers” accounted for 32% of all the posts identified as antisemitic in the study. One of the accounts has 2 million followers, while others have hundreds of thousands.
Six of the 10 influencers were “verified” at the time of the study, meaning they paid for X Premium, received a blue checkmark and had their posts boosted by the platform to be seen by more users.
Three of the 10 also had the ability to charge for subscriptions, meaning followers could pay for exclusive content. (One of the three influencers lost his subscription capabilities toward the end of the study’s period in December 2024.)
In an interview, Ahmed, the CEO of the CCDH, pointed out that when Musk bought Twitter in October 2022, he wrote in an open letter that he did not want the platform to become a “free-for-all-hellscape where anything can be said with no consequences.”
“I’m afraid that for Jews, he’s done precisely that … He’s turned X into a hellscape fantasy, and that’s just his own words,” Ahmed said.
X sued the CCDH in 2023 over its research, accusing the group of violating the company’s terms of service when it studied and then wrote about hate speech on the platform following Musk’s takeover. X said it suffered tens of millions of dollars in damages from CCDH’s publications. But a federal judge threw out the lawsuit in March 2024, excoriating the case as plainly punitive rather than aimed at protecting the platform’s security and legal rights.
“X didn’t sue us for defamation. They sued us for the act of doing research,” Ahmed said. “People don’t sue you for the act of doing research unless there’s something they don’t want you to find.”
The-CNN-Wire
™ & © 2025 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.