Standards for a Responsible AI Future: Reflections on the Seoul Statement
The statement comes at a time when principles of justice, dignity, and human rights are increasingly politicized, questioned, or treated as negotiable.
By Jacobo Castellanos, Coordinator of the Technology, Threats, and Opportunities team at WITNESS
On December 2, the ITU, ISO and IEC issued their Seoul Statement: a vision for AI standards that account for global contexts, rights impacts, and real-world harms.
The Seoul Statement includes four core commitments:
- Integrating socio-technical perspectives into standards: ensuring AI standards address not just algorithms and data, but real-world impacts on people, societies, and the environment.
- Embedding human rights and universal values: protecting dignity, privacy, fairness, and non-discrimination throughout AI design and governance.
- Building an inclusive, multi-stakeholder community: enabling governments, industry, researchers, and civil society to shape global AI norms together.
- Strengthening public–private collaboration and capacity-building: reducing global inequalities so all countries and communities can meaningfully benefit from AI.
This vision is not only welcome; it is a meaningful signal of hope.
It comes at a time when principles of justice, dignity, and human rights—once a shared reference point for international cooperation and for civil society’s engagement with governments and companies—are increasingly politicized, questioned, or treated as negotiable.
Why this matters
Standards, like regulation, form the structural base of the AI stack. By committing to explicitly incorporate human rights and real-world impacts into standards development, the ITU, ISO, and IEC can help steer AI toward protecting human rights, strengthening the information ecosystem, and fostering responsible innovation.
Human rights and civil society groups have been calling for this shift for years (see for example OHCHR’s latest report). Standards alone won’t solve every AI concern, but together with regulation and tooling they can create a pathway toward rights protections and limits on misuse. At WITNESS, we work at the intersection of technology and human rights, and we have seen this firsthand in our work with the Coalition for Content Provenance and Authenticity (C2PA), where a harm assessment continues to shape both the design of the standard and the ecosystem around it. By developing Content Credentials, a form of tamper-evident metadata that travels with an image, video, or audio file to show when, where, and how it was created or modified, C2PA offers a practical example of how standards can embed rights considerations from the ground up.
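To make the idea of tamper-evident metadata concrete, here is a minimal, illustrative Python sketch. It is not the C2PA specification or its tooling, just the underlying principle: a signed claim about an asset’s origin is bound to a hash of the asset’s contents, so any later change to the file is detectable.

```python
import hashlib
import hmac
import json

# Stand-in for a real signing credential; actual Content Credentials use
# certificate-backed public-key signatures, not a shared secret like this.
SECRET_KEY = b"demo-signing-key"

def attach_credentials(media_bytes: bytes, claims: dict) -> dict:
    """Bind claims about when/where/how an asset was made to a hash of its bytes."""
    manifest = {
        "claims": claims,
        "content_hash": hashlib.sha256(media_bytes).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_credentials(media_bytes: bytes, manifest: dict) -> bool:
    """Return True only if neither the asset nor its manifest has been altered."""
    unsigned = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, manifest["signature"])
        and unsigned["content_hash"] == hashlib.sha256(media_bytes).hexdigest()
    )

if __name__ == "__main__":
    photo = b"...raw image bytes..."
    manifest = attach_credentials(photo, {"created": "2025-12-02", "tool": "camera-app"})
    print(verify_credentials(photo, manifest))            # True: file untouched
    print(verify_credentials(photo + b"edit", manifest))  # False: tampering detected
```

Real Content Credentials do this with standardized manifests and public-key signatures backed by certificates rather than a shared secret, which is what allows anyone, not just the issuer, to verify them.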
From Promise to Practice
While this vision is promising, a pressing question remains: How will these commitments be translated into action?
The Seoul Statement was presented during a two-day summit held in Seoul, but concrete plans for its implementation were not shared. Representatives from the ITU, ISO, and IEC did not publicly outline how this vision would be realized, and no details were provided regarding budgets, mechanisms, timelines, or accountability measures.
Standards work is inherently slow and resource-intensive. Incorporating socio-technical and human rights considerations adds another layer of complexity that requires significant investment of expertise, time, and money. Without such investment, the Seoul Statement risks becoming a symbolic gesture rather than a meaningful turning point.
A notable concern was the limited presence of civil society at the Seoul summit. Multistakeholder participation was frequently invoked, yet only a few human rights groups attended, and government and industry voices were far more visible, which is too narrow a basis for defining future AI norms. For the vision of these standards development organizations (SDOs) to carry real weight, civil society must be involved consistently, adequately resourced, and included from the beginning rather than added as an afterthought.
A Call to Stay Engaged
Still, there is reason for cautious optimism. The Seoul Statement represents an important first step, formally issued by institutions that will play a fundamental role in shaping the future of AI. By acknowledging that AI standards cannot be “just technical” and must be grounded in human rights and societal wellbeing, it creates a platform to push for meaningful change.
At WITNESS, we will continue to be actively involved in the C2PA, where we co-chair its Threats and Harms Task Force, and we will engage with the World Standards Cooperation’s AI and Multimedia Authenticity Standards Collaboration (ITU, IEC and ISO) as it positions AI standards as a powerful tool for regulation development and enforcement.
We call on civil society, researchers, regulators and funders to remain engaged, not only when milestones are announced, but through the long, technical, often opaque process of drafting, reviewing and implementing standards. We must also hold the ITU, ISO, and IEC accountable to their own vision, while working to extend this commitment to other national and international SDOs, and to the remaining building blocks that sit atop the foundations of regulation and standards in the AI ecosystem.
Holiday Shopping Reminder!
You can support independent bookstores and get great deals without lining the pockets of billionaires. Help support Internet Exchange by shopping our bookshop The Stack on Bookshop.org. The Stack curates books that connect the dots between code and culture, protocol and protest, theory and practice.
Support the Internet Exchange
If you find our emails useful, consider becoming a paid subscriber! You'll get access to our members-only Signal community where we share ideas, discuss upcoming topics, and exchange links. Paid subscribers can also leave comments on posts and enjoy a warm, fuzzy feeling.
Not ready for a long-term commitment? You can always leave us a tip.
This Week's Links
Open Social Web
- A year into integrating Germ with Bluesky’s AT Protocol, Tessa Brown describes how an encrypted messenger evolved into part of a growing open-protocol social ecosystem. https://rhosf.leaflet.pub/3m7kwii6nws2c
- Hannah Aubry argues that Elon Musk’s decision to block the European Commission from advertising on X highlights why governments and public institutions must move away from corporate-controlled platforms and toward open, decentralised social networks like Mastodon. https://blog.joinmastodon.org/2025/12/the-world-needs-social-sovereignty
Internet Governance
- Don't let anyone tell you that the European Commission's €120 million enforcement against Elon Musk’s X under the Digital Services Act (DSA) is about censorship or about what speech users can post on the platform, writes Daphne Keller in Tech Policy Press. https://www.techpolicy.press/the-eus-fine-against-x-is-not-about-speech-or-censorship
- Elon Musk's X bans the European Commission from advertising on the platform after its €120m fine. https://www.bbc.co.uk/news/articles/c0589g0dqq7o
- At the November 2025 TPAC event, Tara Whalen reported on the W3C/IAB workshop on Age-Based Restrictions on Content, highlighting challenges from diverse regulations, technical approaches, and enforcement roles. https://www.youtube.com/watch?v=0lwsLg1Aa9g
- As our dependence on digital systems deepens, even brief outages can spiral into social paralysis; Gregory Asmolov argues that our next urgent task is to build the psychological, social and practical resilience needed to function when the network goes dark. https://www.linkedin.com/pulse/why-disconnective-resilience-should-our-next-priority-gregory-asmolov-kp7we
- The Chinese tech giant Huawei is under scrutiny for its conduct in an international standards organization. A Huawei patent manager admitted to obscuring his lack of technical expertise to access an influential global body that sets the standards for Wi-Fi technology. https://www.politico.eu/article/huawei-staffer-hides-role-access-key-global-tech-group-quits-wifi-email
- France is pushing the EU to adopt common, union-wide technical standards for labeling and detecting AI-generated content, warning that reliance on proprietary tools from mostly non-European developers could create dangerous dependencies in critical areas such as justice, elections and journalism. https://www.mlex.com/mlex/articles/2419673/genai-developers-see-france-push-for-common-eu-labelling-requirements
- The European Commission has created an expert group to support its roadmap on encryption. https://ec.europa.eu/transparency/expert-groups-register/screen/expert-groups/consult?lang=en&groupID=4005
- In this recorded talk, Laura Lazaro Cabrera of the Center for Democracy and Technology Europe and Daniel Leufer of Access Now explain why the European Commission’s Digital Omnibus is not a simple technical fix and how it weakens key EU digital laws. https://www.youtube.com/watch?app=desktop&v=ChUl6SN84qY&t=42s
- Europe’s bid to “simplify” its digital rules collides with its ambitions for tech sovereignty, raising the question: is the EU safeguarding innovation or quietly dismantling its own digital rights framework? https://www.techpolicy.press/what-is-europe-trying-to-achieve-with-its-omnibus-and-sovereignty-push
- When it comes to AI, Europe risks repeating the mistakes of the past, said Frederike Kaltheuner in the European Parliament. https://www.linkedin.com/posts/frederike-kaltheuner_europes-sovereignty-debate-has-a-blind-spot-ugcPost-7398890436897177600-R_kB/?rcm=ACoAAAF2vmUBb7nX9-vJJu_5XpVoPz9DQWdC2iM
- Two ARMOR side meetings in Montreal explored the scale of network interference and the practical work needed to improve end-to-end resilience. Interested in getting involved in censorship resilience work in the IETF? Open tasks listed at the end. https://mailarchive.ietf.org/arch/msg/armor/Ra3ogsNuAnykRW1axWB2UynRj-E/
- UNICEF warns that social media age bans alone will not keep children safe online and may even push them into less regulated, riskier platforms. https://www.unicef.org/press-releases/age-restrictions-alone-wont-keep-children-safe-online
- John Minnich argues that US-China rivalry is reshaping how power dynamics influence the supply of technology to less developed countries, potentially boosting access to advanced tech in the Global South, unless a future grand bargain cools the competition and reduces the incentives to share it. https://link.springer.com/article/10.1007/s43508-025-00125-9
- Telcos have little appetite for investing in new 6G hardware, pushing Ericsson and Nokia toward software-led strategies as the traditional “G-cycle” of major infrastructure upgrades comes to an end. https://www.lightreading.com/6g/ericsson-and-nokia-get-set-for-the-end-of-the-gs
- Chattanooga’s city-owned fiber network has generated a staggering $5.3 billion in community benefits since 2011, positioning the city not just as America’s first “Gig City” but as the nation’s emerging “Quantum City.” https://communitynets.org/content/chattanoogas-municipal-fiber-network-has-delivered-53-billion-community-benefits-new-study
Digital Rights
- The UK has quietly rolled out a mandatory digital-only immigration status system, making migrants the first population forced to rely solely on digital ID to prove their rights, and new research shows this shift is already causing exclusion, stress, and barriers to work, housing, and travel. https://www.openrightsgroup.org/publications/exclusion-by-design-digital-identification-and-the-hostile-environment-for-migrants
- This position paper argues that the distribution of nonconsensually collected nude images by researchers perpetuates image-based sexual abuse and that the machine learning community should stop the nonconsensual use of nude images in research. https://openreview.net/pdf?id=Ev5xwr3vWh
- A new anonymous phone carrier wants to offer cellular service that makes near-total mobile privacy the permanent, boring default of daily life in the US. https://www.wired.com/story/new-anonymous-phone-carrier-sign-up-with-nothing-but-a-zip-code
- A developer is suing Trump administration officials, alleging that they violated his free-speech rights by pressuring Apple to remove his app, ICEBlock, which let users report sightings of ICE agents. https://www.nytimes.com/2025/12/08/business/apple-iceblock-lawsuit.html
- An inspiring group of feminist writers from across the world curated the personal narratives that make up the anthology "Unyielding: Personal Essays from Women Human Rights Defenders". https://www.genderit.org/resources/unyielding-personal-essays-women-human-rights-defenders
- Tanzania’s post-election tensions have taken a new turn after Instagram removed or restricted the accounts of two prominent activists, Mange Kimambi and Maria Sarungi-Tsehai, following their posts showing alleged killings and disappearances in the wake of the disputed 29 October poll. https://www.youtube.com/watch?v=Q39_BThXvvY
- Major social platforms like X, Meta, YouTube, Google Search and Substack function as a coordinated media ecosystem advancing far-right interests, argues Robin Berjon. https://berjon.com/fascintern-media
- Bread&Net 2025 brought over 550 digital rights actors from West Asia and North Africa together for three days of discussion on surveillance, AI, censorship, and community power. https://us14.campaign-archive.com/?u=826654fd38268e580b78623b6&id=531b8bf214
- Rosanna Garcia, a professor of entrepreneurship who studies social enterprises, explains OpenAI’s transition to a public benefit corporation using KPop Demon Hunters. “OpenAI isn’t HUNTR/X or even the Saja Boys after its recent corporate restructuring. It’s more like the soul-stealing Gwi-Ma, who wins by pretending to deliver good.” https://www.inquirer.com/opinion/commentary/kpop-demon-hunters-openai-artificial-intelligence-resilience-20251208.html?id=gy5UOMv7Kiz87
- What is a social enterprise? https://intangible.ca/2025/10/02/non-profit-vs-for-profit-structuring-a-social-enterprise
Technology for Society
- Instacart is running large-scale pricing experiments that mean different customers pay different prices for the exact same groceries, at the same time, from the same store. https://groundworkcollaborative.org/work/instacart
- Roundabout, a new social network from New_Public, is a community space built from the ground up with community leaders and neighbors. https://newpublic.substack.com/p/introducing-roundabout-built-for
- Small changes to the tone of posts fed to users of X can increase feelings of political polarisation as much in a week as would have historically taken at least three years, research has found. https://www.theguardian.com/technology/2025/nov/27/partisan-x-posts-increase-political-polarisation-among-users-social-media-research
- The judge in OPEN TECHNOLOGY FUND v. LAKE ruled that USAGM likely broke the law by refusing to release OTF’s FY 2025 funding and therefore ordered USAGM to immediately disburse the full amount OTF requested. https://www.courtlistener.com/docket/69765042/open-technology-fund-v-lake
- In 2024, the Internet Society received support from the ARIN Community Grant Program to explore new tools that strengthen the resilience and trustworthiness of the Internet’s routing system, and the results are in. https://pulse.internetsociety.org/blog/exploring-the-potential-of-rpki-signed-checklists-the-results-are-in
- New research finds that when you account for the pressure and downsides people feel from not using popular apps, TikTok and Instagram shift from appearing beneficial to actually reducing overall wellbeing—even for the people who use them most. https://www.aeaweb.org/articles?id=10.1257/aer.20231468
- Social media and AI are like Soma, the fictional drug in “Brave New World” that creates a sense of comfortable complacency, writes Ken Bayer, who warns we’ve been worrying about the wrong dystopia. https://medium.com/@kenbayer/we-were-worried-about-the-wrong-dystopia-4e24d4ce7f7d
- Tech For Palestine introduces Thaura, your ethical AI companion: a platform that actually respects your values, protects your data, and tells the truth. https://updates.techforpalestine.org/announcing-thaura-your-ethical-chatgpt-alternative
- The Financial Times is the latest mainstream newspaper to discover the existence of the Network State cult, writes Gil Duran. https://www.thenerdreich.com/financial-times-wr
- Tech elites are starting their own for-profit cities to escape from regulation and ‘failing’ democracy. https://www.ft.com/content/b127ee7a-5ac4-4730-a395-c9f9619615c7?shareType=nongift
- Future brands Country Life, Tom’s Guide, The Week and The Week Junior, Techradar, Money Week, Marie Claire and dozens more are shifting focus to engaging audiences without relying on Google, as search traffic declines. https://pressgazette.co.uk/publishers/magazines/future-takes-action-on-google-zero-as-revenue-declines
Careers and Funding Opportunities
- AI4ALL: Head of Revenue and Philanthropy. Remote US. https://ai4all.applytojob.com/apply/qUs7qk5GPn/Head-Of-Revenue-And-Philanthropy?source=career+page
- GitLab Foundation: Program Coordinator. Remote US or Colombia. https://www.gitlabfoundation.org/jobs
- Digital Futures Lab: Communications Lead. Goa, India. https://www.linkedin.com/posts/digital-futures-lab_hiring-activity-7403670045366747137-a1Vv/?rcm=ACoAAAF2vmUBb7nX9-vJJu_5XpVoPz9DQWdC2iM
- RIPE NCC. Amsterdam, NL.
- Learning Experience Designer. https://ripencc.recruitee.com/o/learning-experience-designer
- (Junior) Technical Trainer (Network Engineer) https://ripencc.recruitee.com/o/junior-technical-trainer-network-engineer
- (Senior) System and Network Engineer https://ripencc.recruitee.com/o/senior-system-and-network-engineer
- OpenDemocracy and Tech Policy Press: Tech Reporting Fellowship. UK (London preferred). https://www.opendemocracy.net/en/job-opportunity-tech-reporting-fellowship
- Chayn: Translation and Localisation of Trauma-informed Content, Arabic and Turkish. Remote. https://chayn-cio.breezy.hr
Opportunities to Get Involved
- The APC Feminist Tech eXchange (FTX) invites individuals, collectives, and groups to contribute their writing on feminist technology organising and responses to the environmental crisis. Deadline: December 14. https://genderit.org/resources/call-pitches-returning-roots-feminist-technology-movements-organising-and-responding
- Are you doing amazing work related to data privacy or public policy? The Privacy and Public Policy Conference is accepting abstract submissions for oral, lightning, and poster presentations; the deadline is December 15. https://privacypublicpolicy-conference.github.io/website
- Comments on the ITU Secretary-General’s Fourth Draft Report for WTPF-26, published in the lead-up to the January Council Working Group sessions in Geneva, are due December 18. https://www.itu.int/md/S24-WTPF26PREP-R-0004/en
- ITU-T Study Group 21 (SG21) created the Ad Hoc Group on Embodied AI (AHG-EAI) to examine future directions for Embodied AI (EAI) standardization, following discussions at the ITU workshop. For those interested in Embodied AI, and in the AHG-EAI in particular, note that its next (virtual and possibly final) meeting is scheduled for December 18 at 6am EST. https://www.itu.int/en/ITU-T/studygroups/2025-2028/21/Documents/AHGs/2025-10_TOR-AHG-EAI.pdf
What did we miss? Please send us a reply or write to editor@exchangepoint.tech.