Mia Sato is a platforms and communities reporter with five years of experience covering the companies that shape technology and the people who use their tools.
As policy makers in the UK weigh how to regulate the AI industry, Nick Clegg, former UK deputy prime minister and former Meta executive, claimed a push for artist consent would “basically kill” the AI industry.
Speaking at an event promoting his new book, Clegg said the creative community should have the right to opt out of having their work used to train AI models. But he claimed it wasn’t feasible to ask for consent before ingesting their work.
“I think the creative community wants to go a step further,” Clegg said, according to The Times. “Quite a lot of voices say, ‘You can only train on my content, [if you] first ask.’ And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data.”
“I just don’t know how you go around asking everyone first. I just don’t see how that would work,” Clegg said. “And by the way, if you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight.”
The comments follow a back-and-forth in Parliament over new legislation that aims to give creative industries more insight into how their work is used by AI companies. An amendment to the Data (Use and Access) Bill would require technology companies to disclose what copyrighted works were used to train AI models. Paul McCartney, Dua Lipa, Elton John, and Andrew Lloyd Webber are among the hundreds of musicians, writers, designers, and journalists who signed an open letter in support of the amendment earlier in May.
The amendment — introduced by Beeban Kidron, who is also a film producer and director — has bounced between the houses of Parliament, gaining support along the way. But on Thursday, members of Parliament rejected the proposal, with technology secretary Peter Kyle saying that “Britain’s economy needs both [AI and creative] sectors to succeed and to prosper.” Kidron and others have said a transparency requirement would allow copyright law to be enforced, and that AI companies would be less likely to “steal” work in the first place if they were required to disclose what content they used to train models.
In an op-ed in The Guardian, Kidron promised that “the fight isn’t over yet,” as the Data (Use and Access) Bill returns to the House of Lords in early June.