Key topics:
- UK artists protest AI copyright law with a silent album.
- Proposed law lets AI use creative work by default, sparking backlash.
- UK urged to push for AI transparency and fair licensing.
By Parmy Olson and Rosa Prince
Paul McCartney is mad. So are Kate Bush, Elton John and about 1,000 other musicians who released a silent album this week, made by recording empty music studios, to protest the UK's proposed changes to its copyright law. Britain, the country that gave us the Beatles and the Rolling Stones, now wants to make it easier for tech firms to train AI with that creative work, allowing them to use it by default without pay. If creators don't like it, they must opt out.
Unsurprisingly, artists hate the idea, which reverses a fundamental principle of copyright law: You ask for permission before using someone's work. Trade groups say the proposed changes would threaten the livelihoods of singers and publishers and lead to the rabid exploitation of work without compensation.
The furor among artists is not a great look for the UK government, nor is the fact that it framed its proposals as part of a "consultation" with creative industries when it already appeared ready to bow down to technology firms. But there's hope for a solution, maybe even a profitable one. Instead of running headfirst into rewriting the law, the Brits should experiment with ways to build a new commercial market for licensing. Pushing tech firms to be more transparent would be a good start.
Tech giants like Alphabet Inc.'s Google, Meta Platforms Inc. and OpenAI should, for instance, respond to requests from book publishers or movie studios for disclosure of any content used to train an AI model. Once they answer, both sides can start talking about fair compensation. That would build on some of the ad hoc contracts OpenAI and other tech firms have already established with publishers like News Corp. and Axel Springer SE, worth tens of millions of dollars.
"I appreciate that's a difficult commercial conversation on all sides," says Dan Conway, who runs the UK Publishers Association. "That requires some good management."
Enter Prime Minister Keir Starmer, who will need to push tech firms to cooperate. The UK government can and should capitalize on its good relations with Google, whose AI chief, Demis Hassabis, is British and has been based in London his entire career.
Britain can also work off a template it already has in privacy law. For years, British citizens have been able to make "subject access requests" to any company, which must disclose any private data it holds on that person. UK-based artists could do the same for tech companies that have scraped their work for AI training, and be assured of an answer.
This will involve much fumbling in the dark. There's virtually no precedent in commercial history, beyond the act of paying for copyrighted work itself, where businesses have been required to pay retroactively for resources they've already used. And the incentives for doing so aren't great without a nudge from authorities.
It also doesn't help that since the Labour party entered government last July, its approach to AI has been somewhat muddled. In part, this was a reaction to what came before. Former Conservative Prime Minister Rishi Sunak, a self-declared tech bro who was educated at Stanford University and maintained a California holiday home, sought to position the UK as a global tech police force.
That ambition became sidelined more recently in favor of AI as a driver for growth, and earlier this month, the UK joined the US in declining to sign an agreement issued by a global AI summit, held in Paris, which pledged an “open” and “ethical” approach to AI.
Now with the launch of its copyright consultation, which closed this week and was aimed at addressing the lack of legal certainty around using copyrighted work for AI, no one in government seems to have considered the optics of picking a fight with famous names from Dua Lipa to Ed Sheeran and Andrew Lloyd Webber.
Furious artists have proved a powerful mobilizing force, and the opposition has jumped on the bandwagon, with Tory Leader Kemi Badenoch describing the proposals as "a mess." Government insiders meanwhile say it was an error to suggest in the consultation document that the UK preferred its "opt-out" option, and that no final decisions have been made.
One problem is that copyright falls between the Department of Culture, led by Lisa Nandy, and the Department of Science and Tech, with Peter Kyle at the helm. Some in the creative sphere argue that the former has been putting the artists' case less forcefully than the latter has been promoting big tech.
Let's hope the noisy protest will spur the UK government to get a grip. Britain shouldn't rush to rewrite its laws, but focus on the unique opportunity it has in hosting some of the world's top creative professionals, technology builders and policy makers all in the same place.
It has a chance to build a pioneering licensing market for AI and creative content. It should push tech firms to disclose what works they've used, similar to what they already do with personal data. That could help companies exploit high-quality content too. If the UK can make an experiment like that work, it might turn a mess into an opportunity.
Read also:
- Parmy Olson: AI friends are taking over, but at what cost?
- Greatest hits albums: Once music's biggest sellers, now just nostalgic collectibles
- 'The guy who created Spud' on the return of his 'Frankenstein' … and a musical: John van der Ruit
© 2025 Bloomberg L.P.