House of Lords Paves the Way for UK AI Regulation
Late last year, Conservative Peer Lord Holmes introduced an Artificial Intelligence (AI) Regulation Bill into the UK’s House of Lords. The Bill was introduced amidst the backdrop of the EU AI Act negotiations and the UK Government’s own AI working group discussions. Since then, the UK Government has shelved its plans for a code of practice on copyright and AI.
As the Government reflects on its next course of action regarding AI, Lord Holmes’ AI Regulation Bill offers a new path for the MPA to lobby Government to take a more active approach to regulating AI. If enacted, the AI Regulation Bill will “make provision for the regulation of artificial intelligence; and for connected purposes.” Part of this regulation includes the creation of a specialist AI regulation agency, to ensure all businesses and individuals that develop, deploy, or use AI:
- Keep a record of any third-party data they use and have clear consent and permission to use the data for machine learning
- Inform customers that the products and services they are engaging with have been created using artificial intelligence
- Allow authorised third parties to audit their processes and systems.
Members of the House of Lords will have a full debate on the subject on 22 March when the Bill has its Second Reading. While it is unclear whether this Bill will go on to become law, it is a great opportunity for our UK politicians to lay out their arguments as to how the UK should regulate AI. The MPA has been working closely with Lord Holmes to build a raft of support for the Bill. If you would like to lend your support to the Bill by providing a quote, please do not hesitate to contact the MPA’s Chief Public Affairs & Policy Officer Ornella Nsio ([email protected]).
Below is a roundup of the MPA’s key policy recommendations to Government.
AI-generated music must not be granted copyright.
Granting copyright to AI-generated music would pose a material risk of displacing human-created music and could distort the market through a likely oversaturation of machine-generated works.
Rightsholders reserve the right to decide whether their data is ingested.
Freedom of choice is one of the underlying principles of copyright. This choice must not be undermined by any form of compulsory licensing, which would reduce a rightsholder’s options to decide what happens to their data.
All parties involved in machine learning should keep a record of the data they use.
As has been recognised in jurisdictions as diverse as the European Union and China, transparent record-keeping is vital for the secure and confident operation of AI applications. Knowledge of the ingested musical and literary works is key for potential remuneration mechanisms, infringement proceedings, and the elimination of bias. Record-keeping should be introduced as a mandatory requirement for providers of AI applications that release AI music in the UK, regardless of the territory of origin of the AI application or AI creation.
All music generated by AI should be clearly labelled.
Music generated by AI applications must be identified as such to protect consumers from being misled about the artificial nature of AI-generated music.
The UK must introduce personality rights.
The current legal framework protecting creatives from misuse of their likeness or personality is fragmented, leaving individuals without adequate legal recourse and vulnerable to exploitation. The introduction of a specific image and personality right would establish dedicated statutory protection in the UK and offer that protection to all individuals.
Stop “jurisdiction shopping” for the training of large language models.
Last spring the UK Government decided not to proceed with an extension to the UK’s text and data mining exception. While this was a welcome outcome, some British AI developers have begun outsourcing the training of their Large Language Models (LLMs) to overseas territories with supposedly broader text and data mining exceptions.
While the UK Copyright, Designs and Patents Act 1988 (CDPA) already provides a secondary infringement provision covering the importation of an infringing article, there is some debate as to whether LLMs constitute such an “article”. Legislative clarification under Section 27 of the CDPA would create legal certainty that large language models constitute an “article” in this context and protect the UK market from this type of secondary infringement.
Introducing specific standards for LLMs that operate and generate revenue in the UK, as a condition of market access, would also help prevent such jurisdiction shopping. Such standards already exist in other industries, such as pharmaceuticals, food, and drink, and would be readily transferable to the creative sector.
For more information please contact:
Ornella Nsio
Chief Policy & Public Affairs Officer, MPA
[email protected]