Wikipedia, one of the world’s most visited websites and a key source for both the public and artificial intelligence systems, is at the center of a new congressional investigation into whether foreign governments, corporations, and paid operatives are secretly shaping its content. Wikipedia’s open editing model makes its content a target of manipulation that can go undetected for long periods of time.
The House Committee on Oversight and Government Reform has launched an investigation into organized efforts to manipulate Wikipedia content in violation of the platform’s policies. Chairman James Comer of Kentucky and Subcommittee on Cybersecurity, Information Technology, and Government Innovation Chairwoman Nancy Mace of South Carolina have requested documents and information from Wikimedia Foundation Chief Executive Officer Maryana Iskander concerning the foundation’s response to coordinated editing campaigns.
Conservative politicians, public figures, and organizations, including some in Alaska, have seen their profiles subjected to repeated politically motivated editing attacks, requiring constant monitoring.
The inquiry follows reports that Wikipedia articles have been subject to manipulation by foreign governments, corporate interests, and individuals seeking to influence public opinion. Because Wikipedia is widely relied upon by the public and increasingly by artificial intelligence systems for training data, the Committee is examining how disinformation campaigns may distort access to credible information.
Wikipedia has a long record of paid-editing misconduct on its platform. In 2012, the site investigated the public relations firm Wiki-PR, uncovering more than 250 “sockpuppet” accounts that were used for paid editing, which resulted in bans. A year earlier, the UK-based public relations firm Bell Pottinger was found editing Wikipedia entries for its clients, with changes traced directly to company offices. More recently, in 2023, Wikipedia’s community newspaper The Signpost reported that India’s Adani Group allegedly used sockpuppet accounts and undeclared paid editors to reshape its pages, inserting favorable content and removing conflict-of-interest warnings.
Political and ideological disputes have spilled into Wikipedia. In 2023, historians Jan Grabowski and Shira Klein argued that a small group of editors pushed a distorted narrative on Polish-Jewish relations, influenced by nationalist propaganda. In response, other researchers, including Piotr Konieczny, challenged those findings, underscoring the contentious nature of editorial battles on sensitive historical topics.
Nation-states have also been linked to manipulation campaigns. In 2021, the Wikimedia Foundation banned seven accounts tied to Wikimedians of Mainland China after accusations of vote-stacking and doxing, raising concerns about possible state-backed infiltration. In 2025, the House Oversight Committee cited reports of pro-Kremlin and anti-Israel narratives being inserted into articles on conflicts involving Russia and Israel, framing the issue as a matter of national security.
Instances of self-promotion have further complicated Wikipedia’s credibility. In 2024, an editor operating under the name “Swmmng” created or modified 235 articles across projects to promote artist David Woodard, violating rules against conflict of interest and sockpuppetry. Earlier cases include reports in 2010 that IBM advocates edited the History of IBM article to soften references to the company’s ties to the Holocaust.
Wikipedia has also faced exposure to hoaxes and disinformation. In 2007, a false claim that television composer Ronnie Hazlehurst co-wrote a pop song was picked up by the British media before being debunked. In 2014, Russian actors planted a hoax about a chemical plant explosion, an incident that highlighted the risks of coordinated falsehoods spreading beyond the platform. In 2015, during the Gamergate controversy, Wikipedia’s Arbitration Committee intervened to ban editors engaged in manipulation of gender-related articles.
The Oversight Committee’s current investigation seeks to determine how frequently such incidents occur, what tools the Wikimedia Foundation has developed to prevent them, and how effectively it enforces accountability when organized campaigns target sensitive topics. Lawmakers intend to evaluate the platform’s ability to safeguard neutrality as it continues to shape public knowledge and influence emerging technologies.
