The UK Struggles to Balance AI Innovation and Creative Protection

The UK, a global hub for both AI and the arts, is struggling to balance tech innovation with protecting a creative sector increasingly threatened by AI trained on copyrighted works.


By Audrey Hingle, also published in techpolicy.press.

Perhaps nowhere is the tension between content creators and generative AI technologies playing out more acutely than in the United Kingdom. Home to a world-renowned creative sector that contributes £126 billion per year to the economy (roughly 5.7% of GDP) and supports 2.4 million jobs, the UK has long punched above its weight in music, publishing, design, film, advertising, and gaming. These industries rely on robust intellectual property protections and human creativity, making them especially vulnerable to AI systems that are trained on vast amounts of copyrighted content, often without permission or compensation.

At the same time, the UK sees leadership in AI as essential to its economic future. The tech sector contributes over £150 billion to the economy and employs 1.7 million people. Policymakers are positioning AI as a cornerstone of digital innovation, foreign investment, and economic growth. But the UK faces a strategic bind: it is difficult to compete with the scale and dominance of US and Chinese tech giants. This puts added pressure on the government to make the UK an attractive destination for AI development. The recent passage of the Data Use and Access Bill without an artist-backed amendment requiring disclosure of copyrighted training data and the now-completed consultation on AI and copyright highlight the ongoing challenge of balancing tech-sector growth with protections for the creative industries.

Recent Policy Flashpoints

The Data Use and Access Bill (passed June 2025)

The Data Use and Access Bill was introduced to modernize the UK’s data infrastructure and encourage innovation across sectors, including AI. During its passage through Parliament, an amendment championed by artists and Baroness Beeban Kidron proposed requiring AI developers to disclose whether copyrighted materials were used in training datasets. Supporters saw this as a basic measure of transparency and accountability. Opponents, including industry lobbyists, argued it would deter AI investment by creating regulatory uncertainty and compliance burdens. Sir Nick Clegg, former president of global affairs at Meta, argued that asking permission from all copyright holders would "kill the AI industry in this country." The amendment was ultimately excluded from the final legislation, a decision that many in the creative sector interpreted as the government siding with tech firms over creators.

In parallel, the UK government conducted a consultation to clarify how copyright law applies to AI training. The consultation proposes allowing AI models to be trained on copyrighted material unless individual rights holders choose to opt out, paired with transparency obligations for developers and mechanisms for creators to opt out or license their work:

The consultation also proposes new requirements for AI model developers to be more transparent about their model training datasets and how they are obtained. For example, AI developers could be required to provide more information about what content they have used to train their models. This would enable rights holders to understand when and how their content has been used in training AI.  

Many creators argued that this placed the burden of enforcement on individuals, rather than establishing a meaningful consent framework. The consultation attracted widespread public interest, including a silent protest album released by more than 1,000 musicians as a symbolic rejection of the proposals. While the government framed the consultation as an effort to balance competing needs, critics saw it as further evidence that AI growth is being prioritized over creative rights.

What Makes This So Difficult: Competing Pressures and Real Trade-Offs

The UK is caught between two strategic priorities that increasingly pull in opposite directions. On one side is the pressure to stay competitive in a global AI race, where firms choose jurisdictions based on regulatory clarity, access to data, and innovation-friendly policies. On the other is a creative sector that generates significant economic value and global cultural influence, but whose future depends on enforceable rights and control over how creative work is used.

Copyright law was not built for machine learning. Terms like “fair use,” “opt-out,” and “transparency” lack consensus and are hard to enforce across borders. With the EU and US pursuing diverging approaches, the UK lacks a clear model to follow. Decisions made now will set precedents not just for who benefits from AI, but for whose work is valued and protected.

Where This Might Go Next

The government is expected to respond to the consultation later this year, a decision that could shape future regulation or guidance around AI training data. Some experts are calling for an opt-in system that centres consent, rather than placing the burden of opting out on creators. Others (most vocally Nick Clegg) warn that anything more restrictive than the current approach could stifle AI innovation in the UK.

The UK has a decision to make: whether to prioritize tech growth at all costs, or to build a more balanced framework that values creativity as much as computation. This could be a chance to lead internationally by developing a workable licensing framework that both protects rights holders and provides legal certainty for developers. It will take policy creativity to move beyond the binary of Big Tech vs. Big Content, but getting it right could set a meaningful precedent for the future.


Mallory on The PVT Show! 🎙️

IX’s Mallory Knodel joined leadership and mindset coach Poonam Vijay Thakkar to talk about AI, public interest technology, encryption, and internet governance.

They discussed how to set rules for AI, why collaboration matters, and what young people need to know about tech and human rights. Watch below on YouTube, or listen on Apple Podcasts, Spotify, Amazon, Pocketcast or wherever you get your podcasts.

Support the Internet Exchange

If you find our emails useful, consider becoming a paid subscriber! You'll get access to our members-only Signal community where we share ideas, discuss upcoming topics, and exchange links. Paid subscribers can also leave comments on posts and enjoy a warm, fuzzy feeling.

Not ready for a long-term commitment? You can always leave us a tip.

Become A Paid Subscriber

Internet Governance

Digital Rights

Technology for Society

Privacy and Security

Upcoming Events

Careers and Funding Opportunities

United States

Global

Opportunities to Get Involved

  • Bread&Net 2025: call for proposals now open! Bread&Net is a regional forum on digital rights and social justice in West Asia and North Africa, with sessions covering everything from AI and surveillance to movement-building, wellbeing, and public interest tech. Submissions due July 15. https://breadandnet.secure-platform.com/site 

What did we miss? Please send us a reply or write to editor@exchangepoint.tech.

💡
Want to see some of our week's links in advance? Follow us on Mastodon, Bluesky or LinkedIn, and don't forget to forward and share!

Subscribe to Internet Exchange

Don’t miss out on the latest issues. Sign up now to get access to the library of members-only issues.