Last month, the American non-profit organisation behind Wikipedia issued draft guidelines for researchers studying how neutral Wikipedia really is. But instead of supporting open inquiry, the guidelines reveal just how unaware the Wikimedia Foundation is of its own influence.
These new rules tell researchers – some based in universities, some at non-profit organisations or elsewhere – not just how to study Wikipedia’s neutrality, but what they should study and how to interpret their results. That’s a worrying move.
As someone who has researched Wikipedia for more than 15 years – and served on the Wikimedia Foundation’s own Advisory Board before that – I’m concerned these guidelines could discourage truly independent research into one of the world’s most powerful repositories of knowledge.
Telling researchers what to do
The new guidelines come at a time when Wikipedia is under pressure.
Tech billionaire Elon Musk, who was until recently also a senior adviser to US President Donald Trump, has repeatedly accused Wikipedia of being biased against American conservatives. On X (formerly Twitter), he told users to “stop donating to Wokepedia”.
In another case, a conservative think tank in the United States was caught planning to “target” Wikipedia volunteers it claimed were pushing antisemitic content.
Until now, the Wikimedia Foundation has mostly avoided interfering in how people research or write about the platform. It has limited its guidance to issues such as privacy and ethics, and has stayed out of the editorial decisions made by Wikipedia’s global community of volunteers.
But that’s changing.
In March this year, the foundation established a working group to standardise Wikipedia's famous "neutral point of view" policies across all 342 language versions. And now the foundation has chosen to involve itself directly in research.
Its "guidance" instructs researchers on how to carry out neutrality research and how to interpret their findings. It also defines what it considers to be open and closed research questions for people studying Wikipedia.
In universities, researchers are already guided by rules set by their institutions and fields. So why do the new guidelines matter?
Because the Wikimedia Foundation has considerable control over research on Wikipedia. It decides whom it works with, who gets funding, whose work it promotes, and who gets access to internal data. That means it can quietly influence which research gets done – and which doesn't.
Now the foundation is setting the terms for how neutrality should be studied.
What’s not neutral about the new guidelines
The guidelines fall short in at least three ways.
1. They assume Wikipedia’s definition of neutrality is the only valid one. The rules of English Wikipedia say neutrality can be achieved when an article fairly and proportionally represents all significant viewpoints published by reliable sources.
But researchers such as Nathaniel Tkacz have shown this idea isn’t perfect or universal. There are always different ways to represent a topic. What constitutes a “reliable source”, for example, is often up for debate. So too is what constitutes consensus in those sources.
2. They treat ongoing debates about neutrality as settled. The guidelines say some factors – such as which language Wikipedia is written in, or the type of article – are the main things shaping neutrality. They even claim Wikipedia gets more neutral over time.
But this view of steady improvement doesn’t hold up. Articles can become less neutral, especially when they become the focus of political fights or coordinated attacks. For example, the Gamergate controversy and nationalist editing have both created serious problems with neutrality.
The guidelines also leave out important factors such as politics, culture, and state influence.
3. They restrict who research should serve. The guidelines say researchers must share results with the Wikipedia community and "communicate in ways that strengthen Wikipedia". Any criticism should come with suggestions for improvement.
That’s a narrow view of what research should be. In our wikihistories project, for example, we focus on educating the public about bias in the Australian context. We support editors who want to improve the site, but we believe researchers should be free to share their findings with the public, even if they are uncomfortable.
Neutrality is in the spotlight
Most of Wikipedia’s critics aren’t pushing for better neutrality. They just don’t like what Wikipedia says.
Wikipedia has become a target because it is so powerful. Its content shapes search engine results, AI chatbot answers, and educational materials.
The Wikimedia Foundation may see independent and critical research as a threat. But in fact, this research is an important part of keeping Wikipedia honest and effective.
Critical research can show where Wikipedians strive to be neutral but don't quite succeed. It doesn't require de-funding Wikipedia or hunting down its editors, and it doesn't deny that there are better and worse ways of representing reality.
Nor does it mean we should discard objectivity or neutrality as ideals. Instead, it means understanding that neutrality isn’t automatic or perfect.
Neutrality is something to be worked towards. That work should involve more transparency and self-awareness, not less – and it must leave space for independent voices.
Heather Ford receives funding from the Australian Research Council. She was previously a member of the Wikimedia Foundation Advisory Board.