Alarming Rise of AI Music Copyright Laundering

AI music isn’t just flooding streaming platforms; it’s being hidden in plain sight. Now a growing wave of “humanizing” services is helping creators turn AI tracks into royalty-generating “originals,” a practice known as AI music copyright laundering.

by Bobby Owsinski via Music 3.0

I’m not so sure that we should be as angry at music generated by AI as at the humans who use it. It’s bad enough that there’s a huge amount of AI music slop online, but now there’s actually a growing network of services that help users “humanize” AI music just enough that it passes as original, human-made songs.

AI music copyright laundering.

When this “copyright laundering” happens, unscrupulous producers are able to bypass detection methods, register the song for copyright (fully AI-generated songs are not eligible for copyright in the United States), and then collect royalties. All this without disclosing that the song is AI generated, providing attribution, or obtaining any consent.

Humanize Is The New Buzz Word

The AI copyright laundering process starts with a generative AI tool like Suno or Udio, where the new track is generated. That’s when it gets further processed by:

  • Editing stems in a DAW to break up any robotic perfection
  • Adding random timing and “human” swing
  • Applying analog-style mastering to warm up the sound
  • Re-recording parts with hired musicians for authenticity
  • Changing bitrates
  • Inserting noise
  • Adding silence
  • Stripping AI metadata or headers 

Basically, anything you can do to alter the track in some way fools the AI detection on the platforms that actually care about this.
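To make the point concrete, here’s a minimal sketch, using only Python’s standard library, of two of the cruder tricks on that list: inserting low-level noise and stripping metadata by simply re-encoding the file. The function name and parameters are illustrative, not taken from any actual “humanizing” service.

```python
import io
import math
import random
import struct
import wave

def humanize_noise(wav_bytes, noise_amp=4):
    """Re-encode a 16-bit mono PCM WAV, nudging each sample by a tiny
    random amount. Rewriting through the wave module also drops any
    extra metadata chunks, since it writes only the fmt and data chunks."""
    with wave.open(io.BytesIO(wav_bytes), "rb") as src:
        params = src.getparams()
        frames = src.readframes(src.getnframes())
    n = len(frames) // 2
    samples = struct.unpack("<%dh" % n, frames)
    # Clamp to the 16-bit range after adding +/- noise_amp of noise.
    noisy = [max(-32768, min(32767, s + random.randint(-noise_amp, noise_amp)))
             for s in samples]
    out = io.BytesIO()
    with wave.open(out, "wb") as dst:
        dst.setparams(params)
        dst.writeframes(struct.pack("<%dh" % n, *noisy))
    return out.getvalue()

# Demo: one second of a 440 Hz sine tone at 8 kHz, then "humanized".
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(8000)
    tone = [int(10000 * math.sin(2 * math.pi * 440 * t / 8000))
            for t in range(8000)]
    w.writeframes(struct.pack("<8000h", *tone))
orig_bytes = buf.getvalue()
noisy_bytes = humanize_noise(orig_bytes)
```

The noise here is a few samples out of a 16-bit range, far below audibility, yet every byte of the audio data changes, which is exactly what defeats naive fingerprint or hash-based detection.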

The cutting edge is currently called “model poisoning,” which injects inaudible “adversarial noise” into AI outputs so the track sounds unchanged to human ears but tricks the AI detectors into approving it.

Some companies even go as far as to advertise: “Upload your AI track, we’ll humanize it so you can copyright it.” Once processed, these tracks slide into streaming platforms, get registered with collection societies, and start generating income.

In the most elaborate cases, companies offer full humanization with real musicians performing the AI compositions, which are then marketed as “copyright-eligible” and “safe to register.” At least real human musicians are involved.

The Numbers Don’t Lie

Musician and researcher Benn Jordan found that of 560 top songs and staff picks on Suno, a full 98% were already monetized on streaming platforms, usually under fake artist names. One track had over 688,000 plays on Spotify, generating about $2,000 in royalties. This is money that would otherwise go to the human artists on the platform.

Meanwhile, Spotify’s artist count jumped from 10 million in 2023 to 12 million in 2024. With a reported 99,000 new songs uploaded daily across all streaming platforms, the sheer volume makes policing almost impossible.

Once a track is humanized, poisoned, scrubbed, and re-performed, it no longer looks like AI at all. By the time a distributor, DSP, or rights organization touches the file, any forensic trail that was originally available is now cold.

The bottom line is that this is no longer just about whether AI can be detected. It’s about an entirely new ecosystem built to game the streaming system. The music business has plenty of competition as is; it doesn’t need artificial artists piling on as well.

Bobby Owsinski is a producer/engineer, author, blogger, podcaster, and coach. He has authored 24 books on music production, music, the music business, music AI, and social media.

For More On This Topic: “How GenAI music is hacking the system – and getting paid” by Virginie Berger on Music Ally.
