Spotify is finally taking steps to address its AI slop and clone problem

As AI music generators like Suno and Udio make it easy to churn out passable music, streamers like Spotify are getting flooded with AI-generated content, leaving audiences and artists alike to wonder what counts as “real” music anymore. On Thursday, Spotify announced new policies that attempt to tamp down on three AI problems on the platform: slop, impersonation, and the lack of disclosure about whether AI was used to create the music.

The goal of the company’s efforts is to “protect authentic artists from spam and impersonation and deception” and make sure listeners don’t feel “duped,” Spotify’s global head of music product, Charlie Hellman, told reporters at a press briefing about the announcement. At the same time, he said that Spotify wants to let artists use AI if they want.

Spotify is working with the music standards-setting organization DDEX to develop a new metadata standard for disclosing the use of AI in any part of song creation. This includes using AI to create the sounds that end up in the track, like vocals and instruments, as well as using AI as an assistant during mixing and mastering, among other steps, Sam Duboff, head of marketing and policy, said to reporters.
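
As a rough illustration of what such a disclosure might capture, a per-track record could simply flag where AI was involved in the process. The structure and field names in the sketch below are hypothetical, not part of the DDEX standard, which has not yet been published.

```python
# Hypothetical sketch of how an AI-use disclosure might be attached to a
# track's credits. Field names are illustrative only; they are not drawn
# from the (unpublished) DDEX standard.
from dataclasses import dataclass, field

@dataclass
class AIDisclosure:
    ai_generated_vocals: bool = False          # synthetic or cloned voice
    ai_generated_instruments: bool = False     # AI-created instrumental stems
    ai_assisted_post_production: bool = False  # AI used in mixing or mastering
    tools_used: list[str] = field(default_factory=list)

# Example: a track with human vocals but AI-assisted mastering.
disclosure = AIDisclosure(
    ai_assisted_post_production=True,
    tools_used=["example-mastering-assistant"],  # placeholder tool name
)
print(disclosure)
```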

Fifteen record labels and music distributors have committed to adopting the AI disclosures, Duboff said. There is no specific timeline for when the standard will be released, Duboff said, adding that labels and distributors will need to update their process of delivering the credit information to Spotify. 

The platform also upped its response to impersonation, which it defines as including another artist’s voice, real or faked, without their permission, and which covers “unauthorized AI voice clones, deepfakes, and any other form of vocal replicas or impersonation,” Duboff said.

Duboff also said that Spotify is rolling out a music spam filter over the next few “weeks or a few months” that identifies uploaders who try to game the system. He mentioned several tactics commonly used by spammers, like uploading tracks just over 30 seconds long in order to “try to rack up royalty-bearing streams,” or uploading the same tracks many times with slightly different metadata. In the last 12 months, Spotify says it removed 75 million spam tracks.
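
For illustration only, the tactics Duboff described map onto fairly simple checks like the sketch below. This is a hypothetical example of that style of heuristic, not Spotify's actual filter.

```python
# Hypothetical sketch of the kinds of checks described above; an illustration,
# not Spotify's actual spam filter.
from collections import Counter

ROYALTY_THRESHOLD_SECONDS = 30  # a play generally needs to pass ~30 seconds to count

def flag_suspicious_uploads(tracks):
    """Flag uploads that barely clear the royalty threshold or that repeat
    the same audio under slightly different metadata."""
    fingerprint_counts = Counter(t["audio_fingerprint"] for t in tracks)
    flagged = []
    for t in tracks:
        barely_over_threshold = (
            ROYALTY_THRESHOLD_SECONDS
            < t["duration_seconds"]
            <= ROYALTY_THRESHOLD_SECONDS + 5
        )
        duplicated_audio = fingerprint_counts[t["audio_fingerprint"]] > 1
        if barely_over_threshold or duplicated_audio:
            flagged.append(t["title"])
    return flagged

uploads = [
    {"title": "Track A", "duration_seconds": 31, "audio_fingerprint": "abc"},
    {"title": "Track B", "duration_seconds": 210, "audio_fingerprint": "def"},
    {"title": "Track B (alt mix)", "duration_seconds": 210, "audio_fingerprint": "def"},
]
print(flag_suspicious_uploads(uploads))  # ['Track A', 'Track B', 'Track B (alt mix)']
```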

The company also addressed rumors of AI-generated songs getting added to its playlists in order to avoid paying artists. Duboff called the rumors “categorically and absolutely false,” saying that Spotify does not create any music, “with AI or without it,” and that “100% of it is created, owned, uploaded by licensed third parties.”

He did not address whether AI music is being added to Spotify’s editorially curated playlists during the briefing. 

Later, Duboff provided the following comment to The Verge on the matter: 

“Our editors are focused on playlisting music that they think will resonate with audiences. And on tracks that appear to be primarily prompt generated, we’re seeing a very low level of engagement. But there is no truth to the conspiracy theories that we’re adding AI music to playlists or promoting AI music in any way for any financial benefit – all of the music on Spotify is licensed by third parties, we don’t create or own any of it, and we pay out royalties to third parties on all of it.”

Original Source: https://www.theverge.com/news/785136/spotify-ai-slop-impersonation-disclosure

