Encryption and Feminism: We’re Bringing The Conversation Online

A follow-up to our Mozilla Festival session on Encryption and Feminism: Reimagining Child Safety Without Surveillance.

Gerda Binder, Hera Hussain, Georgia Bullen, Audrey Hingle, Lucy Purdon, and Mallory Knodel in our MozFest session.

By Audrey Hingle

Our MozFest session on Encryption and Feminism: Reimagining Child Safety Without Surveillance was bigger than a one-hour festival slot could contain. The room filled fast, people were turned away at the door, and the Q&A could have gone on twice as long. Many attendees told us afterwards that this is the conversation they’ve been waiting to have. That feminist perspectives on encryption aren’t just welcome, they’re needed. So we’re opening the circle wider and taking it online so more people can join in.

In the room, we heard reflections that reminded us why this work matters. In feedback forms, attendees told us encryption isn’t only a security feature, it’s “part of upholding the rights of kids and survivors too, now let’s prove that to the rest of the world!” Another participant said they left ready to “be a champion of encryption to protect all.” Someone else named what many feel: “More feminist spaces are needed!”

It quickly became clear that this work is collective. It’s about shifting assumptions, building new narratives, and demanding technology that does not treat privacy as optional or as something only privacy hardliners or cryptography experts care about. Privacy is safety, dignity, and a precondition for seeking help. It is necessary to explore identity, form relationships, and grow up. Privacy is a human right.

We also heard calls for clarity and practicality: to reduce jargon, show people what encryption actually does, and push for privacy-preserving features beyond encryption itself, like screenshot protection and sender-controlled forwarding.

Participants also reminded us that encryption must account for disparity and intersectionality. Surveillance is not experienced equally. Some communities never get to “opt in” or consent at all. Feminist principles for encryption must reflect that reality.

And importantly, we heard gratitude for the tone of the session: open, candid, grounded, and not afraid to ask hard questions. “Normalize the ability to have tricky conversations in movement spaces,” someone wrote. We agree. These conversations shouldn’t only happen at conferences; they should live inside policy rooms, product roadmaps, activist communities, parenting forums, and classrooms.

So let’s keep going.

New Virtual Session: Encryption and Feminism: Reimagining Child Safety Without Surveillance

🗓️ Feb 10, 4PM GMT, Online

Whether you joined us at MozFest, couldn’t make it to Barcelona, or were one of the many who could not get into the room, this session is for you. We are running the event again online so more people can experience the conversation in full. We will revisit the discussion, share insights from the panel, and walk through emerging Feminist Encryption Principles, including the ideas and questions raised by participants.

Speakers will include Chayn’s Hera Hussain, Superbloom’s Georgia Bullen, Courage Everywhere’s Lucy Purdon, UNICEF’s Gerda Binder, and IX’s Mallory Knodel, Ramma Shahid Cheema, and Audrey Hingle.

Help us grow this conversation. Share it with friends and colleagues who imagine a future where children are protected without surveillance and where privacy is not a privilege, but a right.

We hope you’ll join us!

Related: If you care about privacy-preserving messaging apps, Phoenix R&D is inviting feedback through a short survey asking for input on what features matter most for those in at-risk contexts.


New Book! Hidden Influences: How algorithmic recommenders shape our lives

Hidden Influences: How algorithmic recommenders shape our lives by Dr. Luca Belli

A new book from IX client Dr. Luca Belli looks at how recommender systems function, how they are measured, and why accountability remains difficult. Luca draws on his experience co-founding Twitter’s ML Ethics, Transparency and Accountability work, contributing to standards at NIST, and advising the European Commission on recommender transparency.

Now available via MEAP on Manning. Readers can access draft chapters as they are released, share feedback directly, and receive the final version when complete. The book is suitable for researchers, policy teams, engineers, and anyone involved in the governance or evaluation of large-scale recommendation systems. It is also written for general readers, with no advanced technical knowledge required, so when you’re done with it, hand it to a curious family member who wants to understand how algorithms decide what they see.

Support the Internet Exchange

If you find our emails useful, consider becoming a paid subscriber! You'll get access to our members-only Signal community where we share ideas, discuss upcoming topics, and exchange links. Paid subscribers can also leave comments on posts and enjoy a warm, fuzzy feeling.

Not ready for a long-term commitment? You can always leave us a tip.

Become A Paid Subscriber

Open Social Web

Internet Governance

Digital Rights

Technology for Society

Privacy and Security

Upcoming Events

Careers and Funding Opportunities

United States

Global

What did we miss? Please send us a reply or write to editor@exchangepoint.tech.

💡
Want to see some of our week's links in advance? Follow us on Mastodon, Bluesky or LinkedIn, and don't forget to forward and share!