Andrea James, Andrew Darwin & Anna McKibbin
Keynote
17 Apr 2026
The UK government has published a coordinated set of responses addressing how to balance copyright protection with AI innovation. These include the Department for Science, Innovation and Technology’s (DSIT) Report on Copyright and Artificial Intelligence (“DSIT Report”), the Economic Impact Assessment, and the House of Lords Communications and Digital Committee’s report on AI and the creative industries.
AI could add approximately £55–140 billion to UK Gross Value Added (GVA) by 2030. The UK’s creative industries are world-leading; from Aardman Animations to globally successful games such as Grand Theft Auto, they accounted for 13% of UK services exports in 2023 and are a central pillar of the government’s Industrial Strategy. Against that backdrop, the government’s response is cautious.
No immediate reform to AI and copyright law
The government has moved away from its preferred option of introducing a commercial text and data mining (TDM) exception with an opt-out mechanism for rights holders.
It concluded that stakeholder views were too divided and the evidence too limited to justify legislative intervention in copyright law. It has also confirmed that it will not introduce a new regulator or impose additional regulatory duties at this stage.
This means that the existing framework under the Copyright, Designs and Patents Act 1988 (CDPA) continues to apply, including the narrow TDM exception limited to non-commercial research. The continued reliance on this framework leaves both AI developers and rights owners potentially exposed: AI developers face uncertainty around liability and future legislative changes, while creatives and rightsholders remain concerned about enforcement and their ability to monetise their works effectively.
The government has cautiously opted to monitor legislative and case law developments in the US and EU before committing to a definitive UK approach. This “wait and see” strategy leaves the UK without a clear short-term framework and may impact investment at a critical time for UK economic growth.
Licensing, transparency, and digital replicas
The DSIT Report identifies three key areas for future focus: licensing, transparency, and digital replicas.
Computer-generated works
The government has indicated a clear policy direction on the treatment of computer-generated works (CGWs). Under section 9(3) of the CDPA, where a literary, dramatic, musical, or artistic work is generated by a computer without a human author, the “author” is deemed to be the person who made the arrangements necessary for its creation.
The government has indicated that protection for CGWs should be removed, meaning outputs without sufficient human effort and skill would not be protected by copyright.
This potential change should not affect AI-assisted works which involve human intellectual effort. Here, copyright will continue to subsist in the outputs (subject to any underlying third-party rights). The difficulty will lie in implementing such a change: creative works increasingly incorporate computer-generated elements, often produced by tools with embedded AI features.
Economic trade-offs and international positioning
The economic impact assessment highlights the complex economic trade-offs at the heart of AI and copyright policy but provides limited quantitative certainty.
A more permissive regime for AI training data could support innovation and attract investment. However, weakening copyright protections risks undermining the UK’s creative industries due to uncertainties over licensing and enforcement.
A market-based approach to licensing may offer a pragmatic path forward. While larger players are already entering into licensing arrangements, smaller AI developers may struggle without access to clear, high-quality datasets. Equally, creatives will need confidence that they can negotiate fair terms and enforce them effectively.
Litigation
In the absence of legislative reform, the courts will play a critical role. UK proceedings such as Getty Images v Stability AI have already highlighted evidential and jurisdictional challenges, particularly where training takes place outside the UK. International cases, including Like Company v Google before the Court of Justice of the European Union, are expected to provide further clarity.
While they may shape the practical boundaries of lawful AI training in the near term, large and costly cases such as these do little to provide clarity for SMEs and smaller rights holders.
Takeaways
For now, the UK remains in a period of uncertainty. Creatives, rightsholders, and AI businesses must operate within the current legal framework while monitoring developments closely.
AI developers should:
Creatives and rightsholders should:
With the legal framework still evolving, early engagement with these issues will be critical. Those who act now will be better placed to manage risk and take advantage of emerging opportunities as the UK’s approach to AI and copyright continues to evolve.
If you have questions or concerns about AI and copyright, please contact Rebecca Steer.