If It Breaks Wikipedia, It’s Probably Bad Policy

One Simple Test to Try Before Regulating the Internet.


By Audrey Hingle

Does bad internet policy have a tell? In 2023, IX's Mallory Knodel proposed what she called the “Wikipedia Test.” It was a straightforward principle: if your regulation makes it harder for Wikipedia to exist, it's probably not good regulation. The test challenged policymakers to consider the unintended consequences of laws aimed at commercial platforms. Because Wikipedia, like many open, community-led projects, represents the best of the internet: collaborative, transparent, and built for the public good.

Now, that idea has been formalized. The Wikimedia Foundation has published The Wikipedia Test: a practical tool for assessing whether proposed laws protect or threaten the digital commons by evaluating their impact on open, community-led projects like Wikipedia.

When Good Intentions Break Good Platforms

Here’s the problem: Many well-intentioned laws aimed at regulating commercial platforms can unintentionally threaten nonprofit, volunteer-driven spaces. These are platforms that don’t collect personal data, don’t sell ads, and don’t algorithmically amplify content for profit. But they still risk being swept up in laws that fail to distinguish between the commercial and the public-interest web.

Take the UK’s Online Safety Act. It was designed to regulate major commercial platforms like Facebook and TikTok, but its sweeping scope could apply to Wikipedia, triggering requirements like user identity verification and moderation tools. Or look at efforts to weaken Section 230 protections in the United States. This could expose Wikipedia and other open projects to legal liability for user contributions, threatening the core models that enable community-led editing and moderation.

The Wikipedia Test centers around a simple question: Would this law make it harder for people to access, contribute to, or trust a platform like Wikipedia?

When we say “Wikipedia,” we don’t just mean Wikipedia. The test treats it as a stand-in for a broader set of online spaces that share core values: openness, privacy, community governance, and the free exchange of knowledge. That includes projects like Project Gutenberg, OpenStreetMap, FixMyStreet, and countless open data repositories and scientific archives. These projects are critical to education, civic engagement, and innovation, and they often rely on legal protections and platform design choices that don’t fit commercial models.

The Wikipedia Test Rubric

To make this idea concrete, the Wikimedia Foundation has developed a short rubric that identifies common red flags in proposed legislation. It asks: Could the policy...

  1. Increase the legal risks or costs of hosting community-led public interest projects like Wikipedia?
  2. Make it harder to access or share information, including works that are freely licensed, protected by copyright, or in the public domain?
  3. Threaten user privacy by requiring the collection of sensitive, identifiable information such as ages, real names, or contact details of Wikipedia’s volunteer editors and readers?
  4. Enable surveillance that discourages people from reading or editing Wikipedia?
  5. Increase the risk for people who contribute to or access Wikipedia by enabling governments to collect identifying information, leading to intimidation or retaliation?
  6. Undermine the ability of volunteer editors to govern Wikipedia’s content and community guidelines?
  7. Restrict the free flow of information across borders, limiting access to Wikipedia and its content?

Each of these questions is rooted in real-world policy examples, from mass surveillance programs that chilled Wikipedia readership, to content moderation laws that ignore nonprofit models, to international treaties that enable cross-border data access by authoritarian governments.

The Wikipedia Test isn’t a pass/fail tool. It’s a starting point for dialogue: a way to surface harms early, prompt better questions, and support laws that protect the internet’s role as a public good.

📘 Read the full Wikipedia Test, see the criteria, and explore case studies: https://wikimediafoundation.org/news/2025/06/27/the-wikipedia-test


Summer of Convenings: Public Interest Tech Takes the Stage

It’s summer, and that means it’s convening season. This week was a big one, with major gatherings bringing together thinkers and doers across internet governance and responsible technology. IX’s own Mallory Knodel joined the “Tech for Society” panel at The Tech People Want Online Summit, alongside speakers from Xnet, the Austrian Institute of Technology, and others. The session explored how digital tools can help bridge divides, foster collective action, and support reconciliation efforts at a human scale. Meanwhile, the UN’s AI for Good Summit and the WSIS+20 Forum kept the global internet governance calendar packed.

Did you speak on a panel or attend a session that sparked a fresh idea? We’d love to help you turn it into an article to share with the IX community. Get in touch with us at editor@exchangepoint.tech.


Support the Internet Exchange

If you find our emails useful, consider becoming a paid subscriber! You'll get access to our members-only Signal community where we share ideas, discuss upcoming topics, and exchange links. Paid subscribers can also leave comments on posts and enjoy a warm, fuzzy feeling.

Not ready for a long-term commitment? You can always leave us a tip.

Become A Paid Subscriber

From the Group Chat 👥 💬

Jack Dorsey has launched a new messaging app called Bitchat, a Bluetooth-based, off-grid alternative to WhatsApp that stores no data, requires no accounts, and works without the internet. It is being pitched as a privacy-first, censorship-resistant tool for peer-to-peer communication. CNBC covered the launch with uncritical enthusiasm, but security researchers quickly found serious issues. The app's identity system is mostly cosmetic, leaving it open to basic man-in-the-middle attacks.

And with a range that maxes out in your local coffee shop, is this really a censorship workaround?

In an era when people can vibe code a "secure" app in a weekend, maybe the press could wait until someone checks if it actually works before writing it up like the future of communication?


Internet Governance

Digital Rights

Technology for Society

Privacy and Security

Upcoming Events

Careers and Funding Opportunities

Opportunities to Get Involved

What did we miss? Please send us a reply or write to editor@exchangepoint.tech.

💡
Want to see some of our week's links in advance? Follow us on Mastodon, Bluesky or LinkedIn, and don't forget to forward and share!

Subscribe to Internet Exchange

Don’t miss out on the latest issues. Sign up now to get access to the library of members-only issues.