The ADL says Wikipedia contains antisemitic bias amid dispute over how the Israel-Hamas conflict is represented on the site
By Clare Duffy, CNN
New York (CNN) — The Anti-Defamation League has found evidence of antisemitic and anti-Israel bias on Wikipedia, it said in a Tuesday report, marking the latest rift between the advocacy group and the world’s largest online encyclopedia over how the Israeli-Palestinian conflict should be portrayed on the site.
The ADL claims it identified a network of Wikipedia editors – volunteers who write and moderate content on the popular information website – who appear to have coordinated to “circumvent Wikipedia’s policies to introduce antisemitic narratives, anti-Israel bias, and misleading information” in entries related to Israel and the Palestinians. The report also accuses Wikipedia of failing to enforce its neutrality policy.
But Wikipedia said the ADL’s report includes “unsupported and problematic claims,” according to a statement from a spokesperson for the Wikimedia Foundation, which manages the site.
“The Foundation takes seriously allegations of bias on Wikipedia,” the spokesperson said. “We categorically condemn antisemitism and all forms of hate. It is unfortunate that we were not asked by the report’s authors to provide information and context that might have helped allay some of the concerns raised.”
The report from the Jewish civil rights group comes amid broader tension among users and observers of the online encyclopedia over how the Israel-Hamas war should be covered on the site, with editors quarreling over how to describe events related to the conflict. Wikipedia says it has already been working to fend off bad actors accused of manipulating the platform.
The dispute
Last year, Wikipedia editors labeled the ADL as “generally unreliable regarding the Israeli-Palestinian conflict” because of its dual role as an advocacy and research organization, but “generally reliable” on other topics. The ADL said at the time that the decision was “a sad development for research and education” and that it would prevent information on antisemitism from reaching the public.
Since that decision, “the ADL has continued to misrepresent Wikipedia’s well-established guidelines, policies, and enforcement mechanisms that effectively address the issues outlined in the report and its recommendations,” the Wikimedia Foundation said in its statement.
Separately, Wikipedia confirmed that in January it blocked eight editors from contributing to content related to the Israeli-Palestinian conflict over concerns about bad-faith edits, a move the ADL praised. Wikipedia’s Arbitration Committee, a group of volunteers elected by other editors to manage conduct on the site, also placed articles strictly about the conflict under “extended confirmed protection,” meaning they can be edited only by experienced volunteers.
It’s not the first time there have been dissenting opinions on how contentious issues should appear on the internet’s encyclopedia. Similar disagreements have flared over descriptions of Hong Kong’s relationship to China and the January 6, 2021, attack on the US Capitol. Elon Musk also lashed out at the site earlier this year, calling it “Wokepedia” for its DEI budget and accusing it of political bias. Wikipedia co-founder Jimmy Wales responded that Musk must be “unhappy that Wikipedia is not for sale.”
The disputes echo concerns that have surfaced in recent years over the types of content and language that social media platforms do and don’t allow. But unlike many of those sites, Wikipedia is far more decentralized: volunteer editors create and edit articles while shaping and upholding the site’s rules and norms, though the platform says it uses some automated moderation to police certain kinds of abuse.
“Wikipedia is written and managed by the people who show up,” said Loren Terveen, a computer science professor at the University of Minnesota who has studied Wikipedia and other online communities. “(Disputes) happen all the time … on various topics,” he said.
But the ADL says it worries antisemitic bias on the platform could be dangerous.
“We urge Wikipedia and policymakers to act quickly before rampant disinformation on one of the most visited sources of information leads to tragic consequences,” ADL CEO Jonathan Greenblatt said in a statement regarding Tuesday’s report.
The report
According to the report, the ADL identified a group of 30 Wikipedia editors that it claims coordinated to change pages related to “Israel, Palestine and the Israeli-Palestinian conflict.” It accuses the editors of “downplaying Palestinian antisemitism, violence and calls to destroy Israel while foregrounding criticism of Israel.” The editors also removed citations linking to reputable sources on those topics, such as academic research and mainstream media coverage, it said.
The ADL’s analysis compared three comparison groups of 30 editors each to the 30 editors it says acted in “bad faith.” One group comprised active editors across all Wikipedia pages; another focused on the topic of US-China relations; the third engaged on the Israel-Hamas war page but did not violate Wikipedia’s rules. The report found that the “bad faith” editors were far more active than any of the three comparison groups. Wikipedia says it has hundreds of thousands of volunteer editors.
The 30 editors collectively made changes to 10,000 articles related to Israel and Palestinians, with edits ramping up since Hamas attacked Israel on October 7, 2023, according to the report.
For example, the report states that a “suspicious editor” removed references to mainstream media coverage about calls for the destruction of Israel from a page called “Palestinian political violence.” It also states that a media report referencing sexual violence perpetrated by Hamas during its October 7 attacks was removed from the main page about the group.
The ADL wrote that those and other edits suggest a “systematic effort to skew numerous Wikipedia entries to promote a set of narratives critical of Israel” and to “shift the entries from balanced historical accounts” to a slanted narrative.
The report also claims Arabic Wikipedia pages related to Hamas “glorify” the group, which is designated by the United States as a foreign terrorist organization, and “perpetuate pro-Hamas propaganda.” For example, the Arabic-language Wikipedia page for Hamas refers to the group’s “goals as ‘liberation’ of ‘Palestine’” and refers to suicide bombers as “martyrs,” according to the report, which says the site is failing to enforce its rules consistently across non-English language content.
Wikipedia’s policies say its editors should write from a “neutral point of view,” meaning they should publish content that “almost everyone agrees about” but also make clear where there are points of contention. Editors who want to make major changes to articles should first discuss them with other editors and provide a rationale, the site says.
“The thing that you have to realize is that people are interpreting ‘neutral point of view,’” Terveen said. “And people are going to argue about that.”
In its statement, the nonprofit Wikimedia Foundation said the site has “multiple governance processes” to “prioritize a fair and balanced approach to contentious topics.”
“Wikipedia’s well-documented standards are specifically designed to prevent undue influence and preserve its independent, nonprofit model,” it said. “Wikipedia volunteers have a strong track record of successfully managing neutrality on contentious subjects.”
As part of a series of recommendations in the report, the ADL is calling on the Wikimedia Foundation to develop a program that vets experts on Israel and the Israeli-Palestinian conflict to review contentious pages for accuracy and bias. It also wants Wikipedia to evaluate how effectively its human review and automated tools prevent abuse and manipulation. The Wikimedia Foundation said many of the ADL’s suggestions are already in place on the site.
Terveen noted that it can be challenging for a site that relies on crowdsourced entries to reach a consensus view on controversial topics.
“The big problem is scale. Is there enough human effort to make sure things are done well? And that’s just an ongoing challenge,” Terveen said. “Wikipedia does a good job, but it doesn’t do a perfect job. And, I think, in some sense, it can’t do a perfect job.”