Two years ago, “BBL Drizzy” was the AI music shot heard around the world: an AI-generated track lampooning Drake bubbled up from nowhere and launched what was shaping up to be a battle over artistry, likeness, and, of course, copyright. The big three labels — Universal Music Group (UMG), Sony Music Entertainment, and Warner Music Group — sued AI companies Udio and Suno for copyright infringement “en masse”; UMG pulled its catalog from TikTok in a public spat over issues including AI content on the platform; and the labels began spinning up AI detection tools to keep tabs on how their music moved around.
Now the music industry and AI startups appear largely aligned on a (monetizable) path forward — and it looks a lot like the system artists are already stuck in.
The latest sign: Klay, an AI music startup that has announced deals with the major labels to build a licensed, subscription-based remix product. “KLAY is not a prompt-based meme generation engine designed to supplant human artists. Rather, it is an entirely new subscription product that will uplift great artists and celebrate their craft,” the company’s press release reads. “Within KLAY’s system, fans can mold their musical journeys in new ways while ensuring participating artists and songwriters are properly recognized and rewarded.”
According to a Financial Times report from October, labels were advocating for a compensation framework similar to how traditional music streaming works: micropayments based on plays. Everyone from independent artists to Taylor Swift has complained that the streaming-era payment system squeezes the people actually making the music, with profits funneling up to labels instead. Specifics of the Klay deals weren’t immediately clear, but one can imagine that calculating earnings for AI-generated remixes could be far more complicated than for a stream of the original song: who gets paid, for example, when a user asks for a shoegaze-style remix of a Sabrina Carpenter song? And if that user-generated shoegaze Sabrina Carpenter track ends up going viral on TikTok, racking up millions of views — then what?
The ecosystem for AI-generated music is messy. Spotify said in September that it had pulled 75 million “spammy” tracks in the previous 12 months alone. One track the streamer removed in recent weeks is “I Run” by the unknown artist HAVEN., which was propelled to virality via TikTok. Some listeners mistakenly credited the vocals to R&B artist Jorja Smith, and the track had racked up 13 million streams before Spotify removed it. In September, Spotify added a new policy against artist vocal impersonations. (Songs that are original compositions but sound like a real artist open up a whole new can of worms around a person’s right of publicity.)
The creators of the track told Billboard that they wrote and produced the song but processed vocals using Suno, which allows users to generate songs based on text prompts. Eventually, HAVEN. reuploaded the track, this time using human vocals instead of the Suno-processed Smith soundalike. Some listeners apparently preferred the AI version.
All of this makes for a potentially very weird future of music listening. AI-generated tracks falsely attributed to human artists, with no licensing agreement behind them, will keep popping up, and labels will keep going after them. But if Klay and the big three labels do launch a remix platform, officially licensed AI tracks will mingle on the internet with black-market ones. Songs will be uploaded, pulled, reuploaded, and tweaked, leaving a tangle of questions around ownership and compensation. With these deals, the labels are attempting to walk a line that can only get muddier: AI music based on our artists is fine, as long as we get paid.