
AI-Generated Songs Exploit Deceased Artists [CHRIS CASTLE]

AI-generated songs are being uploaded under the names of deceased artists without permission, and major platforms like Spotify and TikTok are letting it happen.

AI-Generated Songs Are Exploiting Deceased Artists. The FTC Must Act

Op-Ed by CHRIS CASTLE via Music Tech Policy

In a disturbing twist of technological overreach, AI-generated tracks are now being released under the names of deceased artists—without permission from their estates—and distributed on major platforms like Spotify and TikTok. These fakes appear alongside legitimate catalogs, misleading fans and tarnishing artistic legacies. The deception is not hypothetical. It’s already happening.

Reports have surfaced of AI-generated songs published under the names of artists like Blaze Foley and Guy Clark, apparently uploaded via TikTok’s SoundOn platform and pushed to Spotify’s verified artist pages. The result is a digital impersonation—profitable for the uploader, invisible to most fans, and deeply disrespectful to the memory of the artist.

And worse: the estates of these artists were never consulted. They weren’t even notified.

SPOTIFY AND TIKTOK ARE NOT INNOCENT BYSTANDERS

This isn’t just a glitch in the system. It’s a product of deliberate platform design.

Spotify evidently relies entirely on “trusted” distributors like SoundOn to provide metadata—but fails to verify whether the artist is even alive or whether the rights are legitimate. When someone uploads a fake track, Spotify’s system doesn’t flag it. It routes it straight to the artist’s verified profile—right next to their real work. That’s not a neutral mistake. That’s platform-enabled impersonation.

TikTok, through its SoundOn pipeline, makes it easy to upload and monetize content without verifying ownership. That’s how these AI fakes are getting in. Why is the security so bad? You don’t think it’s about the money, do you?


And let’s be honest: this entire setup is presumably profitable for Spotify and TikTok. If it weren’t, why would they continue operating this way? Every fake stream still generates Spotify’s famous 30% vig. Every fraudulent upload still drives engagement. Each one helps fuel the algorithm, creating more activity, more recommendations, and more clicks. The platforms benefit from the fraud, even if the true artists don’t.

To make matters worse, the estates often have no way of knowing this is happening, because if Spotify notified them, Spotify would forfeit the revenue opportunity. Unless a family member or estate representative happens to maintain a Spotify for Artists account—and happens to check it—they may never realize that fraudulent new tracks are appearing under the artist’s official name. There’s no alert. No approval request. No safeguard.

This is not a system that can be trusted.

APPLE PROVES THIS IS A CHOICE, NOT A LIMITATION

Unlike Spotify or TikTok, Apple Music seems to have avoided these impersonation scandals—likely because it works only with a curated group of distributors who are required to verify uploader identity and rights. That basic diligence makes a world of difference. It proves that this isn’t a technical problem—it’s a business decision.

Apple protects artists. Spotify and TikTok profit off them—even the dead ones.

NOT JUST ABOUT ROYALTIES—ABOUT FRAUD

The estates harmed by these impersonations aren’t asking for a royalty check. They’re asking for the fraud to stop. 

And they’re right to. If Spotify and TikTok allow their platforms to be used for impersonating dead artists, with full knowledge of the pattern, it raises serious questions about their responsibility and intent. This isn’t merely a case of failing to moderate content—it could rise to the level of willful blindness, a concept recognized in law when companies deliberately avoid learning the truth.

Spotify may take down the tracks if the estate notifies them, or, more likely, if the estate has the name of an actual person at Spotify to call (sorry, email). But there is no DMCA-style notice-and-takedown for trademark or right of publicity claims.

AI IS MAKING IT WORSE

As generative AI platforms continue scraping creative works—without consent or compensation—there’s a growing risk that these fake tracks will flood the internet, undermining both artists and consumer trust. What does it mean to list a track under an artist’s name if the platform itself can’t vouch for its authenticity?

If these abuses aren’t stopped now, AI-generated fakes could become a permanent fixture of the music ecosystem—especially for artists who can no longer speak for themselves.

A CALL TO THE FTC: INVESTIGATE AND ENFORCE

This is where the Federal Trade Commission (FTC) must step in. The agency has a mandate to protect the public from unfair and deceptive practices—and this qualifies on both counts.

The FTC should investigate:
– Whether Spotify or TikTok knowingly facilitated impersonations;
– Who profited from the AI-generated fakes;
– Why safeguards like Apple’s have not been adopted;
– Whether these practices violate consumer protection law or constitute deceptive business conduct.

The FTC has the power to require changes to distribution and verification processes—and to set a precedent that digital impersonation of artists, living or dead, will not be tolerated.

CULTURAL LEGACY IS ON THE LINE

This issue goes beyond music. It speaks to the future of digital identity, consent, and trust in the age of generative AI. Deceased artists can’t protect themselves. Their estates are often the last line of defense. But they shouldn’t have to fight platform by platform, fraud by fraud, in a whole new game of whack-a-mole simply because the dominant platforms fail to police their systems.

And they certainly shouldn’t be expected to patrol Spotify themselves, hoping to catch unauthorized uploads before fans or press discover them.

It’s time for the FTC to act—and to make clear that the legacies of America’s artists are not up for exploitation.
