
Why I Built a Tool to Stop AI From Scraping Musicians’ Songs

An op-ed courtesy of a musician and producer sick of having his work gobbled up by AI, and looking to reclaim power for indie artists everywhere.

By Josh Stewart of iCloak

I didn’t set out to build a "tech product."

I’m a musician. Like most artists, I’ve spent years learning how to write better songs, record better takes, and slowly build something resembling a career. My relationship with technology has always been practical. Tools exist to help me make and release music, not to replace the joyful act of making it.

That changed when I realized how quietly and aggressively musicians’ work was being absorbed into AI training pipelines.

Today, vast amounts of recorded music are scraped, ingested, and used to train machine learning models without the knowledge or consent of the people who made it. This is often framed as inevitable progress, or dismissed as too abstract for individual artists to meaningfully respond to. But for those of us whose recordings are our labor, our income, and our voice, it feels like a fundamental breach of trust.


The industry conversation around AI often jumps straight to outcomes sold to us under the guise of progress: better tools, faster workflows, new creative possibilities, and any other euphemisms for removing friction. What gets skipped is the most basic question of all: who gets to decide how our work is used?

As it stands, artists usually don’t.

Most musicians don’t wake up wanting to “opt out of AI” (though I’m sure a large majority would). What we want is agency. The ability to say yes, no, or not yet. The ability to experiment on our own terms, rather than having our past work quietly repurposed to power systems we never agreed to participate in, and that now pose the very real threat of displacing us.

The uncomfortable truth is that consent has not been treated as a prerequisite for innovation in this space. Instead, it’s often reframed as an obstacle. If a dataset is large enough, if the scraping is automated enough, if the legal footing is ambiguous enough, then individual creators become easy to ignore.

That’s the context in which I built iCloak.

iCloak builds on earlier research in this space, particularly work like HarmonyDagger, which explored how targeted audio perturbations could interfere with machine learning systems. My contribution wasn’t inventing the idea from scratch, but taking those concepts and turning them into something musicians without technical expertise could use.

iCloak adds imperceptible noise to audio files in a way that interferes with AI training systems while remaining inaudible to human listeners.
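To make the idea concrete: the core trick in this family of tools is adding a perturbation whose per-sample amplitude is bounded far below the threshold of hearing. The sketch below is not iCloak's actual algorithm (a real adversarial tool like HarmonyDagger optimizes the perturbation against target models rather than using random noise); it is a minimal illustration of the "bounded, inaudible change" constraint, with all function names and parameters my own invention.

```python
import numpy as np

def cloak_audio(samples: np.ndarray, epsilon: float = 1e-3, seed: int = 0) -> np.ndarray:
    """Add a low-amplitude perturbation to a float audio signal in [-1, 1].

    `epsilon` bounds the per-sample change so the perturbation stays well
    below audibility. A real tool would optimize delta against a model;
    bounded random noise here is only a stand-in for illustration.
    """
    rng = np.random.default_rng(seed)
    # Perturbation bounded to [-epsilon, +epsilon] per sample.
    delta = rng.uniform(-epsilon, epsilon, size=samples.shape)
    # Keep the result inside the valid audio range.
    return np.clip(samples + delta, -1.0, 1.0)

# One second of a 440 Hz sine tone at 16 kHz, standing in for a recording.
sr = 16000
t = np.arange(sr) / sr
clean = 0.5 * np.sin(2 * np.pi * 440 * t)
cloaked = cloak_audio(clean)
max_change = float(np.max(np.abs(cloaked - clean)))
```

The point of the sketch is the constraint, not the noise: whatever the perturbation is, every sample of the cloaked file stays within a tiny, inaudible distance of the original.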

I didn’t build it because I think technology is the enemy. I built it because artists currently have very few practical ways to assert boundaries in an environment that treats all public audio as fair game.

There’s a tendency in tech culture to frame resistance as fear. If you question the direction of AI development, you’re painted as anti-progress or nostalgic for a past that’s already gone. I don’t think that’s fair, or accurate.


The musicians I know aren’t afraid of new tools. They’re afraid of being erased from the value chain. They’re afraid that the recordings they poured themselves into will be used to train systems that ultimately compete with them, without credit, compensation, or control.

Sadly, those concerns aren’t hypothetical; they’re playing out right now.

What’s missing from many AI debates is any meaningful acknowledgement of power imbalance. Large companies can scrape at scale. Individual artists cannot meaningfully monitor or enforce how their work is used. Telling musicians to “just license their data” assumes leverage they simply don’t have.

So iCloak isn’t a protest against AI. If anything, it’s more of a response to asymmetry, a way to give artists some agency in how their songs are used.


I don’t believe tools like this should have to exist. In a healthier ecosystem, consent would be baked into the system, not bolted on by frustrated musicians. But until that changes, artists need practical options, not just thought leadership and panel discussions.

If nothing else, I hope tools like iCloak force a more honest conversation. One that acknowledges that progress without consent isn’t neutral, and that protecting creative labor isn’t anti-technology.

It’s pro-artist.

And artists deserve a seat at the table when the ‘future of music’ is being built using their life’s work.

Give iCloak a try if you're curious to see it in action.