SoundCloud changes policies to allow AI training on user content

SoundCloud appears to have quietly changed its terms of use to allow the company to train AI on audio that users upload to its platform.
As spotted by tech ethicist Ed Newton-Rex, the latest version of SoundCloud’s terms includes a provision giving the platform permission to use uploaded content to “inform, train, [or] develop” AI.
“You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services,” read the terms, which were last updated on February 7.
The terms include a carve-out for content covered by “separate agreements” with third-party rightsholders, such as record labels. SoundCloud has a number of licensing agreements with indie labels as well as major music publishers, including Universal Music and Warner Music Group.
TechCrunch was unable to find an explicit opt-out option in the platform’s settings menu on the web. SoundCloud did not immediately respond to a request for comment.
SoundCloud, like many large creator platforms, is increasingly embracing AI.
Last year, SoundCloud partnered with nearly a dozen vendors to bring AI-powered tools for remixing, generating vocals, and creating custom samples to its platform. In a blog post last fall, SoundCloud said these partners would get access to content ID solutions to “ensure rights holders [sic] receive proper credit and compensation,” and pledged to uphold “ethical and transparent AI practices that respect the rights of creators.”
A number of content-hosting and social media platforms have changed their policies in recent months to allow for first- and third-party AI training. In October, Elon Musk’s X updated its privacy policy to let outside companies train AI on user posts. Last September, LinkedIn amended its terms to permit the scraping of user data for training. And in December, YouTube began allowing third parties to train AI on user clips.
Many of these moves have prompted backlash from users, who argue that AI training policies should be opt-in rather than opt-out, and who contend that they should be credited and compensated for their contributions to AI training datasets.