If you thought AI-generated videos were getting scary realistic, you now have one more worry to add to that list.
According to music streaming platform Deezer, an overwhelming majority of people cannot tell AI-generated music apart from the real deal written and performed by actual humans.
In a joint survey with market research company Ipsos, Deezer asked 9,000 people across eight countries (the United States, Canada, Brazil, the United Kingdom, France, the Netherlands, Germany, and Japan) to listen to songs and determine whether or not they were AI-generated. A whopping 97% of respondents failed this task.
The participants in Deezer’s survey were split on how to view the findings: 52% said it made them uncomfortable that they could not tell the difference, and 51% said they think AI will lead to more low-quality, generic-sounding music (aka AI slop).
Regardless of how they viewed AI’s role in music, 80% agreed that AI-generated music should be clearly labeled.
“The survey results clearly show that people care about music and want to know if they’re listening to AI or human-made tracks or not,” Deezer CEO Alexis Lanternier said in a press release.
Labeling AI use in music is a hot topic. The conversation was sparked earlier this year, when a rock band called “The Velvet Sundown” amassed a million Spotify streams before it was revealed that the project was AI-generated. The episode prompted growing calls from artists for clear labeling of AI use in music.
Spotify said in September that it will start supporting a “new industry standard for AI disclosures in music credits.” But a quick look at The Velvet Sundown’s artist page shows that there is no clear, upfront labeling yet.
Deezer clearly labels AI-generated content on its platform, but is still home to a growing number of AI-generated songs. The French streaming company announced in September that 28% of the music uploaded on its platform was fully AI-generated.
There’s a likely reason AI-generated music was so hard to distinguish from the real thing in the Deezer survey, and it’s the same reason artists are worried about its rise: these AI song generators are trained on the hard work of actual human musicians.
“There’s also no doubt that there are concerns about how AI-generated music will affect the livelihood of artists, music creation and that AI companies shouldn’t be allowed to train their models on copyrighted material,” Lanternier said in the press release. Seventy percent of the respondents said they believed AI-generated music threatened the livelihood of musicians.
It’s unclear where copyright law goes from here in relation to AI and music. Early signs say the European Union might be siding with the artists. In a key case in Germany, a court ruled earlier this week that OpenAI’s ChatGPT had violated copyright law by training its models with song lyrics.
The story is different elsewhere, though.
Earlier this year, famous British musicians like Elton John and Dua Lipa called on the British government to pass an amendment that would ensure copyright transparency when it comes to how AI companies can use their work to train models. But that amendment ultimately failed.
AI-generated music is winning stateside as well.
In April of last year, leading artists from Billie Eilish to Aerosmith signed an open letter calling on AI developers and digital music services to pledge not to “develop or deploy AI-music generation technology, content or tools that undermine or replace the human artistry of songwriters and artists.” A few months later, major record labels Universal Music Group, Sony, and Warner Records filed a copyright lawsuit against two popular AI music generation startups, Suno and Udio.
But fast-forward a year: Universal Music Group announced that it had not only reached an out-of-court settlement with Udio but was also partnering with the AI company to create a new product trained exclusively on its music catalogue.
Spotify is also doubling down on AI. The streaming giant already uses AI to optimize its algorithm and provides services like an “AI DJ” that mimics a radio host interjecting commentary between songs in a personalized music stream. The company also announced last month that it was planning to collaborate with Sony, UMG, Warner Music, and others to develop “responsible AI products,” without yet divulging what exactly those products would be.
“AI is the most consequential technology shift since the smartphone, and it’s already reshaping how music is created and experienced,” Spotify’s co-president Gustav Söderström said in the press release. “Our company brings deep research expertise to this opportunity, and we’re actively growing our AI team and capabilities to drive the continued growth of the entire music ecosystem.”
AI isn’t just coming for your Spotify playlists, though. Music is everywhere. The livelihoods of real, human musicians depend not just on hit albums but also on catchy brand jingles, movie soundtracks, podcast outros, hold music on phone calls, and plenty of other melodies we take for granted as background sound. In a world where AI takes over music, the jobs of the anonymous musicians who create the soundtrack of our everyday lives could be the first on the chopping block.
Original Source: https://gizmodo.com/people-cant-tell-if-a-song-is-ai-generated-and-thats-why-its-going-to-be-inescapable-2000684711
Disclaimer: This article is a reblogged/syndicated piece from a third-party news source. Content is provided for informational purposes only. For the most up-to-date and complete information, please visit the original source. Digital Ground Media does not claim ownership of third-party content and is not responsible for its accuracy or completeness.
