
AI (Regulation) Bill debated in House of Lords

UK House of Lords

The AI (Regulation) Bill has been tabled by Lord Holmes of Richmond and covers IP obligations, transparency and labelling, as well as the wider regulation of AI across society.

Today, the House of Lords held the second reading of the AI (Regulation) Bill, which seeks to provide a regulatory framework for AI in the UK, including obligations around intellectual property. DACS is grateful to Lord Holmes of Richmond for tabling this important bill and is encouraged by the principles of transparency, consent and remuneration that underpin it.

Introducing the bill, Lord Holmes of Richmond underlined the need for AI regulation and set out the principles that have informed it:

If we are to secure the opportunities and control the challenges, it’s time to legislate, it’s time to lead. Principles-based, outcomes focused, input-transparent, permissioned, understood and where applicable, paid for.

Lord Holmes of Richmond

Speaking specifically to the challenges posed to IP by unregulated AI, Lord Holmes said:

"It is critical to understand that [creatives and copyright holders] want to be part of this AI transformation, but in a consented, negotiated, paid-for manner. As Dan Guthrie, director-general of the Alliance for Intellectual Property, put it, it is extraordinary that businesses together worth trillions take creatives’ IP without consent and without payment, while fiercely defending their own intellectual property. This Bill will change that."

AI regulation and visual artists

Lord Freyberg voiced strong support for visual artists, citing DACS' AI Report. He referenced high-profile cases of artists' work allegedly being used to train AI models without consent, credit or remuneration, and called for a clause specifically addressing remuneration where works are used, with consent, for training, along with a mandate for transparency on training data:

If artists’ IP is being used to train these models it is only fair they be credited, compensated and able to opt-out… While the bill references IP, artists would have welcomed a specific clause on remuneration, and an obligation for owners of copyrighted material used in AI training to be paid. It is therefore critical to maintain a record of every work that AI applications use, particularly in order to validate the artist’s permission… The UK ought to adopt a mandate similar to that in the EU that requires companies to keep track of the content their applications have ingested.

Lord Freyberg

DACS’ work on AI regulation

In advance of this session, DACS was one of a number of organisations from across the creative industries invited to provide feedback on the bill and recommendations to protect the IP rights of creators.

In January 2024, DACS released its AI Report, based on a survey of 1,000 artists and artists' representatives. The results indicate significant concern among artists that unregulated AI would negatively impact their careers, future opportunities and copyright, and that the technology's rapid development has created a skills shortage. They also show strong support for a licensing-based solution to the challenges posed by AI.
