SoundCloud says it hasn’t used music uploads to train generative AI models. But related debates are ongoing after the platform’s updated terms of service seemingly left the door open to such training down the line. Photo Credit: SoundCloud
Former Stability AI VP of Audio Ed Newton-Rex, a longtime musician who’s currently the CEO of Fairly Trained, recently posted about that update on social media. Added in February 2024, per Newton-Rex and SoundCloud itself, the relevant text authorizes the platform to use uploads “to inform, train, [and] develop” AI.
“You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services,” the terms read.
The next paragraph elaborates that “neither SoundCloud nor any third party” can use or reproduce uploaded media “for the purposes of informing, training, developing (or as input to) artificial intelligence technologies without authorization from the applicable rightsholders.”
Despite the latter line, which arguably still leaves the door open to AI training should uploaders fail to opt out, the text isn’t sitting right with artists. Moreover, SoundCloud (which is no stranger to artificial intelligence) left the same door open when responding to the controversy.
“The February 2024 update to our Terms of Service was intended to clarify how content may interact with AI technologies within SoundCloud’s own platform,” the Musiio owner emphasized on X. “Use cases include personalized recommendations, content organization, fraud detection, and improvements to content identification with the help of AI Technologies.”
When closing out the message, however, SoundCloud said it would “keep our community informed every step of the way as we explore innovation and apply AI technologies responsibly.” And subsequently, communications head Marni Greenberg indicated that “clear opt-out mechanisms” would accompany any future decision to train AI models on user uploads.
“Should we ever consider using user content to train generative AI models,” Greenberg relayed on this front, “we would introduce clear opt-out mechanisms in advance—at a minimum—and remain committed to transparency with our creator community.”
And that’s just scratching the surface; many on X criticized SoundCloud for (among other things) possibly deciding against an opt-in model.
“The ‘opt-out’ they commit to in this new statement would be totally unfair on artists, shifting the burden onto them to tell SoundCloud not to train on their music,” Newton-Rex wrote. “Most would miss the chance. Opt-outs are designed to gather as much content as possible for training.”
Driving home his dissatisfaction, Newton-Rex committed to “[d]eleting my SoundCloud” profile, which appeared to still be live on the service at the time of writing.
The decision might be a bit premature at this stage of the game, especially given that some AI companies are alleged to have already trained on protected works without permission. But it’ll be worth closely monitoring the situation (referring not only to SoundCloud’s terms, but to those of other DSPs as well) moving forward.