Copyright and AI Consultation: DACS' key recommendations

In February, DACS submitted a response to the UK government’s open consultation on copyright and artificial intelligence. To compile our response, we drew on our 2024 report AI and Artists’ Work and held focus groups with artists, cultural institutions and AI experts during the consultation process.

Whilst DACS welcomes the consultation’s objective of giving rightsholders more control over the use of their work in AI training, the government’s preferred route to delivering this, an extended text and data mining exception with a rights reservation mechanism, is misguided. This is evidenced by the vocal and widespread concern raised not only by creators and their representatives, but also by the media and publishing industries.

In our response, we make three key recommendations:

  • The government should not adopt an extended text and data mining exception with a rights reservation system. Copyright law already allows creators to ‘opt in’ to having their works used on their own terms.

  • The government should support the UK’s strong copyright framework by encouraging collective and transactional copyright licensing, which benefit both rightsholders and AI developers. Licensing supports fair remuneration for creators and legal certainty for AI developers, thereby increasing uptake of and trust in AI products.

  • The government must introduce transparency measures that ensure AI developers provide detailed information about copyright protected works used in training and deploying AI models. There should be an appropriate regulatory authority to oversee compliance with these measures and remedies available for rightsholders if detailed information is not provided.

95% of artists want control, credit and compensation for their works being used to train AI models.

AI & Artists Report, 2024
DACS

The problem with opt out

Artists are particularly vulnerable to unauthorised AI training due to the mass availability of their works online. This can lead to situations in which an AI-generated output looks identical to the artist’s own work and competes with them in their marketplace. Some AI developers have trained their models on UK artists’ works without permission and without pay.

The government proposed an opt-out style system to give artists control over their work; however, this system is flawed. It is impossible for individuals to control the ongoing uses of an image made by third parties. For example, an artist can state on their own website that they reserve the rights in their images, or use a robots.txt file to ‘opt out’ the URLs of their site, but if their image is reproduced by a third party without this opt-out notice, for example by being shared on social media, the artist has no way to enforce their opt-out. Even if the artist locates a reproduction and persuades the social media user to remove the post, the work may already have been used for AI training. Additionally, an opt-out system could require significant time to manage, placing a further burden on artists.
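To illustrate why this is so fragile: a robots.txt opt-out covers only the website that serves the file, and each AI crawler must be named individually. A sketch is below (the domain is hypothetical; the crawler tokens shown, such as OpenAI’s GPTBot, Google-Extended and Common Crawl’s CCBot, are examples of real tokens in use at the time of writing, though the list changes as new crawlers appear):

```
# robots.txt served at https://artist-site.example/robots.txt (hypothetical site)
# Asks named AI crawlers not to index anything on this site — but only this site.

User-agent: GPTBot          # OpenAI's training crawler
Disallow: /

User-agent: Google-Extended # Token controlling Google AI training use
Disallow: /

User-agent: CCBot           # Common Crawl
Disallow: /

# A copy of the same image uploaded to social media is governed by that
# platform's robots.txt, not this one — the opt-out does not follow the work.
```

Note too that robots.txt compliance is voluntary: a crawler that ignores the file faces no technical barrier, which is why DACS argues an opt-out reservation cannot substitute for enforceable rights.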

DACS is concerned about the potential impact of the government’s proposed route on creators. We believe the government’s approach risks undermining the royalty income that many visual artists rely on to support their practice, whilst inverting the established principles of copyright. Before AI entered the conversation, artists’ incomes had already been squeezed and their opportunities reduced. In the research we commissioned in 2024, UK Visual Artists: A Survey of Earnings and Contracts, researchers found that visual artists are some of the lowest-paid workers in the creative industries, earning on average £12,500 a year, and that their incomes have decreased significantly since 2010.

Copyright royalties are an important part of an artist’s portfolio of earnings, so it is critical that the UK maintains a strong copyright framework that allows artists to control uses of their work and earn money from licensing. The government should introduce measures to improve visual artists’ pay and conditions, thereby improving access to and diversity within the sector. Changing the UK’s copyright framework risks driving more artists out of the profession, to the detriment of the entire arts and culture ecosystem in the UK, and is a step in the wrong direction.

Transparency and Licensing

The strongest route to achieving the government’s goals of control, access to data and transparency is to ensure that AI developers comply with copyright law. DACS considers that copyright licensing, through collective or blanket licences coupled with transactional and bespoke licensing for specific uses, will not only deliver fair rewards to artists and other creators, but also bring legal certainty to technology companies, in turn improving public trust in their AI applications.

The barriers to licensing so far have been a lack of transparency over what copyright protected works have been used in AI training, and a lack of good-faith negotiations between AI companies and rightsholders such as artists.

A recent University of Oxford Institute for Ethics in Artificial Intelligence consultation brought together representatives from the creative industries – including DACS – with those in the AI sector, agreeing a set of principles that could enable a mutually beneficial relationship between creators and responsible AI developers, with transparency and licensing at its core.

During the government consultation period, DACS worked with PICSEL and independent researcher and artist Caroline Sinders of Convocation Design and Research to convene a focus group of artists, arts organisations, legislators and AI experts, outlining routes forward on transparency, rights reservation and labelling for policymakers, legislators and technologists. This report will be released in the coming weeks.

Ultimately, licensing underpinned by transparency is mutually beneficial for the visual arts and AI sectors: creators could be fairly remunerated and engage with AI tools without concerns about copyright-related risks, and AI companies can develop engaging commercial products that support creativity. This would incentivise the creation of AI models and products which operate with principles of consent, control, compensation and transparency.

On the other hand, the text and data mining exception proposed by the government would make effective licensing near impossible, and place additional burdens on artists and other freelance rightsholders, who lack the resources to enforce their rights retroactively.

What next?

The strength of feeling across rightsholder groups is clear, evidenced by the broader Make it FAIR campaign and the more than 11,000 responses the consultation received.

Since the consultation closed, DACS has joined other rightsholder groups in co-signing a letter to the Secretary of State for Science, Innovation and Technology, Peter Kyle MP, written by James Frith MP.

We will continue to make the case for a licensing model that gives artists control, working with our colleagues across the creative industries, and we will continue to engage with AI developers and those working with AI to develop responsible and transparent models.


Thank you to all our members and supporters who took the time to complete the copyright consultation or shared our campaign.
