I first noticed something weird when a HEALTH album dropped on my Spotify new release radar. Except the cover design was funny — it didn’t look like a HEALTH album.
Some kind of AI slop had been uploaded to HEALTH’s artist page on Spotify, one of three fake albums that would appear under their name that weekend. The band’s X account made some jokes about it, the albums were eventually removed, and I went back to minding my own business. Then, the next weekend, I saw a new Annie album had dropped.
That album was more plausible — Annie had just released a new single, “The Sky Is Blue” — but when I clicked in, the single was nowhere on the track list. Confused, I played the album and heard birdsong and a vaguely New Age-y instrumental. That… did not sound like Annie.
So I did what any normal person would do: I bitched about it in my group chat. Which was how I heard that this was happening to other artists, like a lot of artists, and had been happening for months. (“I get one of these in my release radar often,” my buddy Gordon said.) For a while, metalcore artists such as Caliban, Northlane, and Silent Planet had been targeted. But so had a lot of artists with single-word names, such as Swans, Asia, Standards, and Gong. A new album would appear on an artist’s Spotify page, bearing their name but no similarity to their music. Sometimes, as with the fake HEALTH albums, they would disappear after a few days. Other times, they would linger indefinitely, even against the artist’s will.
“It was super weird,” says Marcos Mena, Standards’ lead songwriter and guitarist. “I thought, ‘Oh, this is something Spotify will take care of.’” After all, Standards has a verified artist page. But when a fake album was posted on September 26th, it didn’t budge. Mena emailed Spotify to tell them there’d been a mistake. The streamer responded two weeks later, on October 8th: “It looks like the content is mapped correctly to the artist’s page. If you require further assistance, please contact your music provider. Please do not reply to this message.” As of November 8th, the fake Standards album was still right there under the band’s verified, blue-checked name. It was finally removed by November 11th.
“That was upsetting to me, because if you have ears, you can definitely hear it’s not our music,” Mena told me. “It’s definitely a bummer because we did have a new album come out this year, and I feel like it’s detracting from that.” What if someone came to a concert where Standards was opening for another band, went to Spotify to check out more tunes, and got the fake album instead?
To me, this all raised an obvious question: fucking why?
Given the history of scams on Spotify, I think the answer is money. (The answer is almost always money.)
To understand how this works, you need a sense of the mechanics. Streaming platforms like Spotify don’t work like your Facebook page — Mena and other artists aren’t logging in and adding albums to their accounts directly. Instead, they go through a distributor that handles licensing, metadata, and royalty payments. Distributors send songs and metadata in bulk to the streaming services. The metadata part is important; it includes things such as the song title and artist name but also other information, such as the songwriter, record label, and so on. This is crucial for artists (and others) to get paid.
But this whole process effectively works on the honor system, and for something like the fake Standards album, this is where the problems begin. A distributor takes you at your word that you are who you say you are, Spotify takes the distributor at their word, and boom, there’s a fake album on a real artist’s page. Most of the time when this happens, it’s an honest mistake. In the recent spate of fakes, though, it seems like artists are directly targeted.
Because the money an artist receives for streams goes through the distributor, the fake Standards album — should it get any payout at all — will reward someone other than Mena. We know the real Standards are on Topshelf Records, but the fake appears to be on something called Gupta Music, so Standards’ real label isn’t getting a cut, either. If enough people stream the album, the royalties will flow straight to Gupta Music… along with the payout from hundreds of other releases full of slop.
Mena said he’d filed with his distributor to have the fake Standards album taken down. But whoever did it — Spotify or his distributor — didn’t notify Mena; I did, when I asked if he’d been involved in the removal. He wasn’t, he texted me, “but yayyyyy someone did something.”
Going to Every Noise at Once — essentially an encyclopedia of what’s on Spotify — and searching for Gupta Music, I saw more than 700 releases. The cover art looked remarkably similar and smacked of AI. The purported band names were mostly one word: “Rany,” “Living,” “Bedroom,” and “Culture.” The albums shared names with the faux bands. A search for “Gupta Music” returned only a 14-year-old TED Talk by a man named Robert Gupta.
Even the supposed label name struck me as suspicious. There’s a well-known marketing agency called Gupta Media. It reps entertainment companies including Disney Music Group, Republic Records (in service of The Weeknd’s album), and Sony Music.
It looks like Standards, Annie, HEALTH, Swans, and a number of other notable one-word artists were targeted directly. Spotify confirmed that the onslaught of AI garbage was delivered from one source, the licensor Ameritz Music. Ameritz Music did not respond to a request for comment.
“Due to significant and repeated violations of Spotify’s Metadata Style Guide, we ended our relationship with the licensor that provided the content in question,” said Chris Macowski, Spotify’s global head of music communication, in an emailed statement. “As a result, the content was removed from our platform.”
Macowski also said that Spotify “invests heavily in automated and manual reviews” to prevent royalty fraud.
Earlier waves that targeted metalcore musicians such as Fit for an Autopsy, Alpha Wolf, and Like Moths to Flames also appear to have been coordinated attempts to siphon off legitimate streams. The culprit in that case was Vibratech Musicians, according to Idioteq.
Each individual payout for a song stream on Spotify is tiny, as legitimate musicians who use the platform frequently lament. But hundreds or thousands of songs with relatively modest streaming numbers, when combined, can lead to big payouts. A fraudster just has to upload music and find a way to make accounts play it. And this doesn’t just happen on Spotify. There are more than 100 streaming platforms where artists can run this scam.
In early November, Universal Music Group (UMG) sued Believe, a music distributor, and its US subsidiary TuneCore. In that lawsuit, UMG alleges Believe has a “conscious business strategy of indiscriminately distributing and purporting to license tracks with full knowledge that many of the clients of its distribution services are fraudsters.” The suit is about copyright infringement, and the details in it are striking.
UMG alleges that artists such as “Kendrik Laamar,” “Arriana Gramde,” “Jutin Biber,” and “Llady Gaga” are among those Believe uploaded — suggesting a strategy of attempting to capture streams from users who had simply typoed.
Another strategy is creating AI covers of popular songs and getting them onto popular playlists so normal people will stream them. Another involves bots “listening” to songs.
Earlier this year, a Danish man was sentenced to 18 months in prison for using bots to get about $300,000 in royalties. Another man, Michael Smith, was arrested and charged with defrauding streaming services out of $10 million over the course of seven years. Smith used AI tools to create hundreds of thousands of songs under the names of fake artists such as “Calm Baseball,” “Calm Connected,” and “Calm Knuckles.” He then streamed the huge catalog using bots, billions of times, prosecutors allege. That diverted money that should have gone to real musicians that real people were really listening to. (In this case, Spotify paid only $60,000 to Smith, suggesting the company’s protective measures worked to limit payments, Macowski said.)
“People upload massive amounts of albums that are intended to be streaming fraud albums,” says Andrew Batey, the CEO of Beatdapp, a company that aims to prevent streaming fraud. Batey estimates that $2 billion to $3 billion is stolen from artists through this kind of fraud every year.
Distribution plays a big role. Most distributors’ business models are based on getting a cut of whatever royalties come back to the artists and labels. “Even though they may not be participating in the fraud, they directly benefit from it,” Batey says. In its suit, UMG alleges that Believe “wrongfully collects royalties it knows are properly payable” to rights holders such as UMG on copyrighted material.
A sophisticated fraud operation will use multiple fake labels and multiple distributors in order to avoid having a single point of failure. Besides bot accounts, a number of bad actors have access to real people’s compromised accounts. “They log in as you and me, play their song three times and leave,” Batey says. That fake stream is then hidden among all the real listening the real account is doing.
Gupta Music wasn’t the only label I found doing bulk uploading. There were three more doing something similar: Future Jazz Records, Ancient Lake Records, and Beat Street Music. All had also uploaded hundreds of albums with AI-looking album art. It’s unclear how these labels intended to generate streams, if at all. By the time of publication, most of those albums had been removed.
Problems with metadata have existed for years — some of them innocent, some considerably less so. “We’ve basically gotten lucky so far,” says Glenn McDonald, a former Spotify employee who runs Every Noise at Once. “The content validation system without any input on the artist level is fairly crazy.”
When something goes wrong, there are two levels where it can be addressed: the streaming service and the distributor. Distributors have to strike a delicate balance. They make their money by getting a cut of the streaming payout. If they are too aggressive in policing the uploads, legitimate artists get caught. Fixing that is expensive, and distribution is a low-margin, bulk business, McDonald says. But allowing too many junk bands through creates problems with the streaming services.
As for the streaming services, they usually have data that could allow them to sort this out. If, for instance, the distributor that usually uploads Standards albums isn’t the one used for the new album, that’s the kind of thing that could be used to flag the album for review. (So is the change in label.) McDonald told me he also built tools for Spotify to identify when a song doesn’t sound like the rest of an artist’s catalog. Sometimes that can happen for legitimate reasons; an EDM remix of an Ed Sheeran song isn’t going to sound like Ed Sheeran, but it may still have happened with the label’s and artist’s approval.
Also, some legitimate artists share the same name, especially tiny indie bands, and they just have separate pages. “The way it should have worked, getting the plumbing right, is that all those albums should have been flagged as new artists, and then it wouldn’t matter,” McDonald told me.
As for the distributors, the thing to keep an eye on is UMG’s lawsuit. A pretrial conference is scheduled for January. The outcome of the suit could potentially change how distributors filter the music people try to upload through their platforms — because if lawsuits are more expensive than content moderation, there’s likely to be more content moderation. That could improve things for Spotify, which is downstream of them.
Just banning AI-generated content from Spotify — or distributors — might feel like an intuitive solution. But despite the backlash against AI-generated media, there are legitimate AI-generated songs. For instance, “10 Drunk Cigarettes,” from Girly Girl Productions, is something of a hit. (It is also likely human-assisted, rather than wholly AI-generated.) UMG has made a deal with Soundlabs to allow artists to use Soundlabs’ AI vocals for themselves. It’s also partnered with Klay Vision to create a model for generating music.
Besides, AI is just an accelerant for a type of fraud that’s lived on Spotify for years, says Batey. Fraudsters used to dig up old, obscure albums and digitize them or slightly alter a song that already existed. AI has just cut down on the amount of work that’s required to make the fake song needed to get the streaming money.
At the same time… accelerants sure do make things burn down faster. Plenty of platforms have become less useful as they’ve been choked with AI glurge — Facebook, Instagram, the artist formerly known as Twitter, even Google itself.
AI music poses the same threat to Spotify, McDonald says. He points out that I had been waiting for the Annie album, excited for it, even. And then instead, I got duped into garbage. “There’s all these mechanisms around assuming this stuff is correct,” he says. But right now, those mechanisms are broken — and people who truly care, like artists themselves, don’t have their hands on the controls.