Zuckerberg Scales Back Content Moderation—What Happens Next Is Up to Us

Photo by Ben Hershey / Unsplash

Meta's announced changes to its content moderation policies deeply disappointed us. Mallory has dedicated decades of her career to working in human rights organizations, helping to create the critical policies Meta is now casting aside. I spent the last five years of my career at Mozilla Foundation, advocating for platform accountability. In our feature this week, we explore how we think various actors can seize this moment to collectively build alternatives that truly serve the communities we've both hoped would thrive online.

But first...

Lock And Code

Mallory was featured on the Lock and Code podcast two days ago, discussing the evolving landscape of AI and encrypted messaging, including the challenges and opportunities for privacy and security. Listen on Apple Podcasts, Spotify, or your preferred podcast platform.

What else are people saying about Meta's content moderation changes?


Researchers, fact-checkers, and trust and safety experts are expressing concern

Although this change is officially limited to Meta's U.S. policies, it is expected to have global implications.

And users are shifting to new platforms that align with their values

In other news...

Careers and Funding Opportunities

What did we miss? Please send us a reply or write to editor@exchangepoint.tech.


A Call To Build Better Social Media Spaces

By Mallory Knodel and Audrey Hingle

Mark Zuckerberg’s recent decision to scale back content moderation efforts marks a sharp departure from over a decade of incremental progress, driven by human rights organizations the world over, to craft content moderation rules that protect the most marginalized global communities while also protecting free speech.

Mallory has firsthand experience working with organizations like the Association for Progressive Communications, ARTICLE 19, the Dangerous Speech Project, the Global Network Initiative, and the Center for Democracy and Technology, which have tirelessly advanced nuanced moderation frameworks, proving the power of collaboration between civil society and tech platforms. Scaling these frameworks back risks undoing years of civil society progress at a time when misinformation and online harms continue to rise, and when more funding than ever has been poured into academic- and civil society-led research and advocacy to improve social spaces online.

Meta’s decision mirrors a troubling trend. X (formerly Twitter) has remade its brand on the deliberate rejection of fact-checking, content moderation and the API-integrated platform governance mechanisms that researchers and human rights groups fought to establish. Both Meta and X have moved towards community notes and annotation models as alternatives to traditional content moderation systems. 

Research suggests that both approaches have value: professional fact-checking has a positive influence on correcting misinformation, and aggregated judgements from diverse groups of individuals are effective at distinguishing true from false information. The two could complement each other, combining the scalability of crowd-based approaches with the rigor of expert-led fact-checking. However, dismantling systems developed collaboratively with civil society before robust replacements are tested risks leaving users to fill the gaps with unpaid labour.
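To make the crowd-based half of that concrete, here is a toy sketch of a "bridging"-style rule, in which an annotation is surfaced only when raters who usually disagree with each other both find it helpful. This illustrates the general idea only, not Meta's or X's actual algorithms, and the rater names and clusters are hypothetical.

```python
# Toy sketch of a "bridging" rule for crowd annotations: a note is surfaced
# only when raters from groups that usually disagree both rate it helpful.
# Illustration of the general idea, not any platform's actual algorithm.
from collections import defaultdict

# Hypothetical raters assigned to two clusters ("A" and "B") that tend to
# disagree with each other on most content.
RATER_CLUSTER = {"ana": "A", "bo": "A", "cy": "B", "dee": "B", "eli": "B"}

def should_surface(note_votes: dict[str, bool], min_per_cluster: int = 2) -> bool:
    """Surface a note only if enough raters from *each* cluster found it helpful."""
    helpful_by_cluster = defaultdict(int)
    for rater, helpful in note_votes.items():
        if helpful:
            helpful_by_cluster[RATER_CLUSTER[rater]] += 1
    clusters = set(RATER_CLUSTER.values())
    return all(helpful_by_cluster[c] >= min_per_cluster for c in clusters)

# A note endorsed across clusters is shown; a one-sided note is not.
print(should_surface({"ana": True, "bo": True, "cy": True, "dee": True}))    # True
print(should_surface({"ana": True, "bo": True, "cy": False, "dee": False}))  # False
```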

That unpaid labour is really the crux of the issue. Users, including journalists, activists, celebrities, and public agencies, are already producing the content that fuels these multibillion-dollar companies. Now those same users are being asked to do the work of policing that content too, taking on the roles of moderators, curators, and promoters without compensation or recognition.

This moment represents an opportunity to rethink our relationship with social media. If we are the ones creating the content, culture and connections that make these platforms valuable, why not direct our joy, creativity, reporting, care, our time and our energy into building social spaces that are ours?

Let’s get started. 

Human rights groups

Experiment with Fediverse technology to develop, implement, and improve trust and safety measures that prioritize user well-being. Platforms like Ghost, Mastodon, and others in the Fediverse ecosystem allow for decentralized content moderation approaches, enabling human rights organizations to play a role in crafting policies that protect marginalized communities and amplify diverse voices.
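As a minimal sketch of what that decentralized collaboration could look like in practice, several organizations could each publish a machine-readable denylist of abusive domains that individual instance admins merge and review locally. The organization URLs below are hypothetical placeholders, and real-world list formats (such as Mastodon's domain-block CSV exports) vary.

```python
# Minimal sketch: merge domain denylists published by several trusted
# organizations into one list an instance admin can review and apply locally.
# The organization URLs below are hypothetical placeholders.
import urllib.request

DENYLIST_SOURCES = [
    "https://example-rights-org.net/fediverse-denylist.txt",  # hypothetical
    "https://example-research-lab.org/spam-domains.txt",      # hypothetical
]

def fetch_list(url: str) -> set[str]:
    """Fetch a newline-separated list of domains, ignoring comments and blanks."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        lines = resp.read().decode("utf-8").splitlines()
    return {line.strip().lower() for line in lines
            if line.strip() and not line.startswith("#")}

def merged_denylist(sources: list[str]) -> set[str]:
    """Union of all source lists; each instance still decides what to adopt."""
    merged: set[str] = set()
    for url in sources:
        merged |= fetch_list(url)
    return merged

if __name__ == "__main__":
    for domain in sorted(merged_denylist(DENYLIST_SOURCES)):
        print(domain)
```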

Journalists

Join the Fediverse and create your own spaces for storytelling, reporting, and engagement. Federated platforms like Mastodon let you integrate emerging tools and move between "instances" without losing your audience. You can even host your own instance and have full control over content and moderation policies. The best part is that with open, federated social media platforms you can stay connected to your friends, colleagues, and other accounts you still want to follow, but from a platform that shares your values and is governed accordingly.
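That portability rests on open protocols: any server can resolve a handle like @you@your-instance.example through the WebFinger discovery endpoint (RFC 7033) that Mastodon-compatible servers publish. A minimal lookup, with a placeholder handle, might look like this:

```python
# Minimal sketch: resolve a Fediverse handle to its ActivityPub actor URL via
# WebFinger (RFC 7033), the discovery endpoint Mastodon-compatible servers
# publish at /.well-known/webfinger. The example handle is a placeholder.
import json
import urllib.parse
import urllib.request

def resolve_handle(handle: str) -> str | None:
    """Return the ActivityPub actor URL for a handle like 'user@instance.example'."""
    user, domain = handle.lstrip("@").split("@", 1)
    query = urllib.parse.urlencode({"resource": f"acct:{user}@{domain}"})
    url = f"https://{domain}/.well-known/webfinger?{query}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.load(resp)
    for link in data.get("links", []):
        if link.get("rel") == "self" and link.get("type") == "application/activity+json":
            return link.get("href")
    return None

if __name__ == "__main__":
    print(resolve_handle("someone@mastodon.social"))  # placeholder handle
```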

Artists and Creatives

Start using federated platforms to share your work, collaborate and create new cultural hubs. Help set healthy aesthetics and cultural trends in these new spaces. New platforms like Pixelfed are stable and looking for early adopters!

Technologists and Open-Source Developers

Seek or create job opportunities contributing to federated platforms and open-source projects. Many initiatives in the Fediverse ecosystem are seeking developers to build tools that improve accessibility, trust and safety, governance systems, and scalability.
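As one hedged example of that kind of tooling, a first pass at a moderator's triage script could pull recent posts from an instance's public local timeline (Mastodon's documented GET /api/v1/timelines/public endpoint) and flag those matching a community-maintained keyword list. The instance URL and keywords below are placeholders, and real moderation work needs far more nuance than keyword matching.

```python
# Minimal sketch of a triage helper for instance moderators: fetch recent
# posts from a Mastodon instance's public local timeline and flag those
# matching a community-maintained keyword list. Instance URL and keywords
# are placeholders; real moderation requires far more nuance than this.
import json
import re
import urllib.request

INSTANCE = "https://mastodon.example"         # placeholder instance URL
FLAG_TERMS = ["scamcoin", "miracle cure"]     # placeholder keyword list

def fetch_local_timeline(instance: str, limit: int = 20) -> list[dict]:
    """Fetch recent public posts local to the instance."""
    url = f"{instance}/api/v1/timelines/public?local=true&limit={limit}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def flag_posts(posts: list[dict], terms: list[str]) -> list[dict]:
    """Return posts whose HTML content matches any flagged term (case-insensitive)."""
    pattern = re.compile("|".join(re.escape(t) for t in terms), re.IGNORECASE)
    return [p for p in posts if pattern.search(p.get("content", ""))]

if __name__ == "__main__":
    for post in flag_posts(fetch_local_timeline(INSTANCE), FLAG_TERMS):
        print(post["url"])
```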

Federated Social Media

Recognize that your users are your partners, collaborators and greatest assets. Together you can seize this moment to redefine social media as a space for community and belonging. Work with them to articulate a future where social media truly serves its users, not corporate or political agendas. Work to interoperate with other platforms!

Advocacy Organizations

Continue the tireless work of holding platforms accountable for harm, and advocate for legislation and regulation that protects decentralized platforms from monopolistic practices and ensures equitable access to digital infrastructure. Legislative proposals can build on consensus-based technical protocols to support consumers and to break up tech monopolies in social media.

Trust and Safety Experts and Researcher Community

In a time when trust and safety work is often misunderstood or demonized, we hope these experts will continue their vital efforts and bring their expertise to the Fediverse. By shaping community-driven moderation and governance, they can help create spaces where all users feel valued and protected. The Fediverse means interoperability between platforms, but it is also a robust space for innovation, integration, and intermediary services like community content moderation and fact-checking.

Funders

Step up to fund and support federated social media initiatives. Provide the financial resources necessary to improve governance-by-design and develop innovative solutions for content moderation and trust and safety. This is a critical moment to empower noncorporate alternatives that prioritize the public good. By funding federated infrastructure improvements, research, and collaborative projects, you can help catalyze a shift toward platforms that center users and their rights. Free Our Feeds is one initiative you could get involved with; it aims to create a decentralized social media ecosystem that operates in the public interest, free from the control of billionaires and private corporations.

The future of social media depends on reclaiming the power from corporate interests and investing in community-driven, decentralized platforms that prioritize equity, trust, and the well-being of their users.

💡
Please forward and share!

Subscribe to Internet Exchange

Don’t miss out on the latest issues. Sign up now to get access to the library of members-only issues.