Subscribe for full access to The Hollywood Reporter
“Our position is simple: AI should support artists, not replace them,” SoundCloud CEO Eliah Seton said in a letter published Wednesday.
By Ethan Millman
Music Editor
SoundCloud has issued an update to its terms of service, days after the music platform caught heat from musicians and advocates over a previous update to its policy on artificial intelligence training.
In an open letter published Wednesday, SoundCloud CEO Eliah Seton wrote that the company “has never used artist content to train AI models,” echoing a statement a SoundCloud representative shared with The Hollywood Reporter last week.
“Not for music creation. Not for large language models. Not for anything that tries to mimic or replace your work. Period,” Seton wrote. “We don’t build generative AI tools, and we don’t allow third parties to scrape or use artist content from SoundCloud to train them either.”
The backlash comes from a February 2024 update to SoundCloud’s terms of service to say that users “explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services.”
That update started making the rounds last week, drawing the ire of musicians wary their content would be used to train generative AI models. Similar to SoundCloud’s initial statement last week, Seton said in the letter that the platform updated its terms of service “to clarify how we may use AI internally to improve the platform for both artists and fans,” citing functions like improved search, playlisting and content recommendations.
Seton said Wednesday, however, that the language in that update was “too broad and wasn’t clear enough.”
“It created confusion, and that’s on us,” Seton wrote. “That’s why we’re fixing it.”
Per the letter, SoundCloud is making another update to the terms of service saying “we will not use Your Content to train generative AI models that aim to replicate or synthesize your voice, music, or likeness without your explicit consent, which must be affirmatively provided through an opt-in mechanism.” The old language will be stricken.
Seton said that “if there is an opportunity to use generative AI for the benefit of our human artists, we may make this opportunity available to our human artists with their explicit consent, via an opt-in mechanism.”
“Our position is simple: AI should support artists, not replace them. Any use of these tools on SoundCloud will continue to reflect that,” Seton said. “AI is going to be a part of the changing landscape of music. It brings new opportunities, but also very real challenges. That’s why our approach will always be guided by a single principle: artist-first.”
SoundCloud’s move comes as AI remains one of the most contentious issues in the music and entertainment industries, as underscored by the concern following the ouster of U.S. Register of Copyrights Shira Perlmutter over the weekend.
While the change addresses some critics’ concerns, it hasn’t appeased everyone. Ed Newton-Rex, the founder of the nonprofit music advocacy group Fairly Trained, who said last week he’d be removing his music from the platform, tweeted Wednesday that the update “doesn’t go nearly far enough.”
“Their new terms will say they won’t train gen AI models that replicate your voice / style. But they leave the door open to the much more likely gen AI training: models trained on your work that might not directly replicate your style but that still compete with you in the market,” Newton-Rex wrote. “If they actually want to address concerns, the change required is simple. It should just read ‘We will not use Your Content to train generative AI models without your explicit consent.’”
Read Seton’s full letter here.
Sign up for THR news straight to your inbox every day