Sidelined UX Research: Lessons From Meta’s Senate Hearing

Whistleblower testimony reveals how corporate pressure can twist research and undermine its integrity.


By Michal Luria, Ph.D., Research Fellow at the Center for Democracy & Technology

It is not every day that user experience (UX) researchers find themselves testifying before the United States Senate. Yet that is precisely what happened two weeks ago, when former Meta employees Jason Sattizahn and Cayce Savage stepped forward as whistleblowers. Their testimony raised serious allegations about how safety research on virtual reality (VR) products was handled within Meta. It also raised broader questions about the hurdles UX researchers face when their findings conflict with corporate priorities.

The hearing came shortly after The Washington Post published a story about how safety research was allegedly shaped, constrained, ignored, or suppressed by Meta. If true, this significantly undermines research integrity; deleting concerning data, misrepresenting results, and suppressing inconvenient findings directly violate the most fundamental codes of research ethics.

What made last week’s testimony striking was not just the alleged misconduct itself, but the glimpse into the unique and complex challenges that UX researchers face when conducting research inside a tech giant. Researchers are often caught in the middle of competing pressures: leadership prioritizing brand reputation, legal teams focused on limiting liability, and product teams pushing for growth and engagement. As a result, the potential for conflicts of interest with research findings is an everyday reality, and even well-intentioned research can end up misrepresented, skewed, or ignored entirely.

UX research teams, and especially safety research teams, are also commonly under-resourced; many report heavy workloads, limited staffing, and a lack of support when their findings clash with business goals. According to Dr. Sattizahn’s testimony, the one constant during his six years at Meta was “not having enough money to build for safety.”

But the absence of research isn’t even the worst outcome. The truth is that misleading research is worse than no research at all: a gap in research is easy to identify, but research riddled with malpractice is extremely difficult to detect and call out.

This all adds up to a host of pressures working against good internal UX research. But giving up on the profoundly important function of UX would be a devastating outcome, as many company researchers work extremely hard to surface safety risks and push for change within the industry. Instead, companies need to take deliberate steps to protect and empower their research teams.

Where Does UX Research Go From Here? 

First, companies must recognize that having research teams is not enough. At a time when companies are pouring millions into hiring “superstar” AI researchers, it’s important to emphasize that research integrity isn’t about prestige hires; it demands leadership across all parts of a company that respects research findings, even when those findings are uncomfortable. It also requires clear accountability structures, including internal auditing processes.

Second, there must be greater transparency and public visibility into research methods and approaches. Sharing findings and research processes (in ways that protect participant privacy) would allow the public, regulators, and other stakeholders to understand people’s experiences of products and their shortcomings, and to act on conclusions and recommendations. This kind of openness would not only help build trust in companies’ internal research, but would also ensure that critical safety issues are not hidden behind corporate walls.

Third, industry should establish ways for independent researchers to access company data so they can conduct their own research. Recent actions are a step back in that regard, including Meta’s closure last year of CrowdTangle, a critical tool for conducting research and providing transparency using company data. Providing vetted external researchers with datasets allows for both verification and replication of internal research findings, as well as additional independent research into important safety questions.

As Cayce Savage noted in her testimony, the purpose of UX research is to advocate for users, especially for vulnerable populations like children. This mission requires that researchers have the freedom to ask difficult questions, apply appropriate methodologies, and communicate findings clearly and accurately. When those freedoms are compromised, the researchers themselves, and the millions of people relying on technology to be safe, will likely bear the cost.


Now available for download: Global Governance of Low Earth Orbit Satellites 🚀 Featuring IX's Mallory Knodel

The new book Global Governance of Low Earth Orbit Satellites features a chapter from Mallory, “Insights from the Internet – How to Govern Outer Space” (Section IV, p. 207), where she compares the challenges of governing a borderless internet with those of managing an increasingly crowded orbit, drawing lessons from multistakeholder internet institutions like ICANN and the IGF.

The volume as a whole brings together international experts to tackle the legal, policy, and technical dimensions of Low Earth Orbit satellites, from digital sovereignty and cybersecurity to spectrum allocation and space sustainability. It’s both a scholarly resource and a practical policy guide for shaping sustainable, inclusive, and secure governance of emerging space-based internet systems.

Support the Internet Exchange

If you find our emails useful, consider becoming a paid subscriber! You'll get access to our members-only Signal community where we share ideas, discuss upcoming topics, and exchange links. Paid subscribers can also leave comments on posts and enjoy a warm, fuzzy feeling.

Not ready for a long-term commitment? You can always leave us a tip.

Become A Paid Subscriber

Internet Governance

Digital Rights

Technology for Society

Privacy and Security

  • This is a weird story: The US Secret Service says it disrupted a large SIM farm in the New York area ahead of the UN General Assembly, seizing 300 SIM servers and 100,000 SIM cards allegedly tied to nation-state actors, organized crime, and terrorist groups. Experts and commentators, however, are skeptical: SIM farms are typically used for spam, scams, fake accounts, or cheap international calls, not large-scale infrastructure attacks. https://www.schneier.com/blog/archives/2025/09/us-disrupts-massive-cell-phone-array-in-new-york.html

Upcoming Events

Careers and Funding Opportunities

Opportunities to Get Involved

What did we miss? Please send us a reply or write to editor@exchangepoint.tech.

💡
Want to see some of our week's links in advance? Follow us on Mastodon, Bluesky or LinkedIn, and don't forget to forward and share!