Zuckerberg Scales Back Content Moderation—What Happens Next Is Up to Us
Meta's announced changes to its content moderation policies deeply disappointed us. Mallory has dedicated decades of her career to working in human rights organizations, helping to create the critical policies Meta is now casting aside. I spent the last five years of my career at Mozilla Foundation, advocating for platform accountability. In our feature this week, we explore how we think various actors can seize this moment to collectively focus on building alternatives that truly serve the communities we've both hoped would thrive online.
But first...
Lock And Code
Mallory was featured on the Lock and Code podcast two days ago, discussing the evolving landscape of AI and encrypted messaging, including the challenges and opportunities for privacy and security. Listen to it on Apple Podcasts, Spotify, or your preferred podcast platform.
What else are people saying about Meta's content moderation changes?
Researchers, fact-checkers, and trust and safety experts are expressing concern
- Mark Zuckerberg’s fact-checking announcement is filled with bad-faith reasoning. https://www.niemanlab.org/2025/01/zuck-chucks-fact-checkers-to-cosplay-as-elon-musk/
- Meta’s planned shift away from third-party fact-checking on Facebook in favour of a crowdsourced approach has perplexed those who study the spread of misinformation. https://www.nature.com/articles/d41586-025-00027-0
- Fact-checking program partners pen an open letter warning of a setback for accuracy online and potential global consequences. https://www.poynter.org/ifcn/2025/an-open-letter-to-mark-zuckerberg-from-the-worlds-fact-checkers-nine-years-later/
- Meta’s announcement includes a change to its algorithm design, which is the least clear and likely the most impactful. https://www.techpolicy.press/to-evaluate-metas-shift-focus-on-the-product-changes-not-the-moderation/
Although this change is officially limited to Meta's U.S. policies, it is expected to have global implications.
- Meta Oversight Board members warn Zuckerberg’s shift to "community notes" undermins global content moderation and harms international users. https://www.prospectmagazine.co.uk/politics/free-speech/69005/mark-zuckerberg-is-playing-narrow-politics-with-trump
- Meta’s fact-checking partners around the world are disappointed — but not surprised — by Facebook’s move to do away with fact-checking by trained teams. https://restofworld.org/2025/meta-drops-fact-checking-partnerships-global-watchdogs-scramble/
- Britain’s new online safety laws are “not up for negotiation,” a minister warned, responding to Mark Zuckerberg’s pledge to join Trump in pressuring countries over content “censorship.” https://www.theguardian.com/technology/2025/jan/11/tech-giants-told-uk-online-safety-laws-not-up-for-negotiation
And users are shifting to new platforms that align with their values
- Mastodon CEO Eugen Rochko advocates for building social media on open protocols rather than proprietary platforms, emphasizing user autonomy and interoperability. https://techcrunch.com/podcast/social-media-should-be-built-on-protocols-not-platforms-says-mastodon-ceo-eugen-rochko/
- As disputes over content moderation grow and users gravitate toward communities that align with their values, Renée DiResta explores the implications of this trend and how the fragmentation of online communities could impact social cohesion. https://www.noemamag.com/the-great-decentralization/
In other news...
- Mastodon’s CEO and creator is handing control to a new nonprofit organization; Mastodon says the decentralized network ‘should not be owned or controlled by a single individual.’ Cool! https://www.theverge.com/2025/1/13/24342603/mastodon-non-profit-ownership-ceo-eugen-rochko
- Grassroots movements in East Asia are advancing open-source AI by bridging linguistic and cultural divides, promoting equity and innovation, and influencing policy discussions, as highlighted at COSCUP 2024. https://opensource.org/blog/open-source-ai-and-policy-from-the-perspective-of-east-asia
- NATO has launched "Baltic Sentry," a mission deploying naval drones, submarines, ships, and aircraft to protect undersea cables in the Baltic Sea from potential sabotage, particularly by Russia. https://ground.news/article/nato-launches-new-baltic-sea-mission-to-protect-undersea-cables_73bfa6
- FTC takes action against GoDaddy for alleged lax data security for its website hosting services. https://www.ftc.gov/news-events/news/press-releases/2025/01/ftc-takes-action-against-godaddy-alleged-lax-data-security-its-website-hosting-services
- New York Assemblymember Alex Bores is drafting the RAISE Act to regulate advanced AI models, drawing inspiration from California's unsuccessful SB 1047. https://www.technologyreview.com/2025/01/09/1109875/a-new-york-legislator-wants-to-pick-up-the-pieces-of-the-dead-california-ai-bill/
- Gravy’s data leak shows how vulnerable our location info is in the ad-tech world. https://www.wired.com/story/gravy-location-data-app-leak-rtb/
- Social media platforms have perpetuated exploitative content reminiscent of "Girls Gone Wild," highlighting the ongoing challenges in moderating harmful material online. https://www.nytimes.com/2025/01/15/opinion/girls-gone-wild-meta-social-media.html
- The FBI warns cell phone users to stop texting and use end-to-end encrypted messaging and calls “wherever possible.” https://www.forbes.com/sites/zakdoffman/2025/01/10/fbis-iphone-android-warning-new-update-means-you-may-send-texts-again/
- At a recent UN Security Council meeting, governments called for spyware regulations to curb misuse and protect human rights, the first time this type of software has been discussed at the Security Council. Never too late to notice a problem! https://techcrunch.com/2025/01/15/governments-call-for-spyware-regulations-in-un-security-council-meeting/
- The Software Freedom Conservancy's Sustainer program ensures long-term support for software freedom through individual contributions, reducing reliance on corporate funding. Help them meet their match challenge! https://floss.social/@karen/113834656913106799
Careers and Funding Opportunities
- Measurement Lab is looking for a Director, Technical Lead to help drive delivery of Measurement Lab’s projects and programs in the open internet measurement ecosystem. https://www.measurementlab.net/jobs/2024-12/director-tech-lead/
What did we miss? Please send us a reply or write to editor@exchangepoint.tech.
A Call To Build Better Social Media Spaces
By Mallory Knodel and Audrey Hingle
Mark Zuckerberg’s recent decision to scale back content moderation efforts marks a sharp departure from over a decade of incremental progress driven by human rights organizations the world over to craft content moderation rules that protected the most marginalized global communities, while also protecting free speech.
Mallory has firsthand experience working with organizations like the Association for Progressive Communications, ARTICLE 19, Dangerous Speech Project, Global Network Initiative and Center for Democracy and Technology, which have tirelessly advanced nuanced moderation frameworks, proving the power of collaboration between civil society and tech platforms. Scaling them back risks undoing years of civil society progress at a time when misinformation and online harms continue to rise, and when we’ve seen more funding than ever poured into academic- and civil society-led research and advocacy to improve social spaces online.
Meta’s decision mirrors a troubling trend. X (formerly Twitter) has remade its brand on the deliberate rejection of fact-checking, content moderation and the API-integrated platform governance mechanisms that researchers and human rights groups fought to establish. Both Meta and X have moved towards community notes and annotation models as alternatives to traditional content moderation systems.
Research suggests that both approaches have value: professional fact-checking has a positive influence on correcting misinformation, and aggregated judgments from diverse groups of individuals are effective in distinguishing between true and false information. The two could complement each other, combining the scalability of crowd-based approaches with the rigor of expert-led fact-checking. However, dismantling systems developed collaboratively with civil society, before robust replacements are tested, risks leaving users to fill the gaps with unpaid labor.
And that’s really the crux of the issue. Users, including journalists, activists, celebrities, and public agencies, are already producing the content that fuels these multibillion-dollar companies. Now those same users are being asked to do the work of policing that content, too, taking on the roles of moderators, curators and promoters without compensation or recognition.
This moment represents an opportunity to rethink our relationship with social media. If we are the ones creating the content, culture and connections that make these platforms valuable, why not direct our joy, creativity, reporting, care, our time and our energy into building social spaces that are ours?
Let’s get started.
Human rights groups
Experiment with Fediverse technology to develop, implement and improve trust and safety measures that prioritize user well-being. Platforms like Ghost, Mastodon, and others in the Fediverse ecosystem allow for decentralized content moderation approaches, enabling human rights organizations to play a role in crafting policies that protect marginalized communities and amplify diverse voices.
Journalists
Join the Fediverse, and create your own spaces for storytelling, reporting and engagement. Federated platforms like Mastodon let you integrate emerging tools and move to different “instances” without losing your audience. You can even host your own instance and have full control over the content and moderation policies. The best part is that with open, federated social media platforms you can stay connected to your friends, colleagues and other accounts that you still want to follow, but from a platform that shares your values and is governed accordingly.
Artists and Creatives
Start using federated platforms to share your work, collaborate and create new cultural hubs. Help set healthy aesthetics and cultural trends in these new spaces. New platforms like Pixelfed are stable and looking for early adopters!
Technologists and Open-Source Developers
Seek or create job opportunities contributing to federated platforms and open-source projects. Many initiatives in the fediverse ecosystem are seeking developers to build tools that improve accessibility, trust and safety, governance systems, and scalability.
Federated Social Media
Recognize that your users are your partners, collaborators and greatest assets. Together you can seize this moment to redefine social media as a space for community and belonging. Work with them to articulate a future where social media truly serves its users, not corporate or political agendas. Work to interoperate with other platforms!
Advocacy Organizations
Continue the tireless work of holding platforms accountable for harm, and advocate for legislation and regulation that protects decentralized platforms from monopolistic practices and ensures equitable access to digital infrastructure. Legislative proposals can rely on consensus-based technical protocols to support consumers and break up tech monopolies in social media.
Trust and Safety Experts and Researcher Community
In a time when trust and safety work is often misunderstood or demonized, we hope these experts will continue their vital efforts and bring their expertise to the Fediverse. By shaping community-driven moderation and governance, they can help create spaces where all users feel valued and protected. The Fediverse means interoperability between platforms but also a robust space for innovation, integration and intermediary services like community content moderation and fact checking.
Funders
Step up to fund and support federated social media initiatives. Provide the financial resources necessary to improve governance-by-design and develop innovative solutions for content moderation and trust and safety. This is a critical moment to empower noncorporate alternatives that prioritize the public good. By funding federated infrastructure improvements, research, and collaborative projects, you can help catalyze a shift toward platforms that center users and their rights. One initiative you could get involved with is Free Our Feeds, which aims to create a decentralized social media ecosystem that operates in the public interest, free from the control of billionaires and private corporations.
The future of social media depends on reclaiming the power from corporate interests and investing in community-driven, decentralized platforms that prioritize equity, trust, and the well-being of their users.