D.I.Y.

How to make money from song stems with AudioShake’s Jessica Powell

Jessica Powell, CEO and co-founder of AudioShake, talks about how keeping the parts of your songs separate can open the door to new opportunities in sync placements, remixing, and more.

by Byta

Digital Dialogue is an interview/guest blog series presented by Byta, and written by friends of Byta. Exploring niche, behind-the-scenes topics within the digital realm of the music ecosystem, Digital Dialogue presents readers with insights into challenges, successes and passion topics in the day-to-day life of those working deep in digital.

Why the Future of Audio Depends on the Parts of Your Song – Jessica Powell (AudioShake)

Jessica Powell

Jessica Powell is the CEO and co-founder of AudioShake, which uses AI to separate recordings into instrumentals and stems so that they can be used in sync licensing, remixing, and remastering, as well as emerging immersive, education, and social media formats. The company counts all three major label groups among its customers, as well as top publishers like Concord, Reservoir, Primary Wave, and Hipgnosis; iconic indie labels like Sun, Cherry Red, and Mute; and production music libraries, indie artists, and producers.

Powell spent over a decade at Google, where she sat on the company’s management team, reporting to the CEO, and ran its global communications organization. She began her career at CISAC, the International Confederation of Societies of Authors and Composers, in Paris. She is also the author of the novel The Big Disruption: A Totally Fictional but Essentially True Silicon Valley Story, and her essays and fiction have been published in the New York Times, TIME, WIRED, and elsewhere.

“By separating recordings into their parts and instrumentals–in mere seconds–an artist’s music has more opportunities to be listened to and enjoyed in new ways.”

Who are you? Where do you work?

Hi! I’m Jessica Powell, and I’m the Co-Founder of AudioShake, a start-up that separates the full mix of a song into its instrumental and stems, so that artists and labels can open up their songs to more uses in sync licensing, remixes, remastering, spatial mixes, and more. I’m based in San Francisco, California.

What are you currently listening to?

Fuerza Regida, Juana Molina, Terrell Hines… and whatever my coworkers are posting to the music channel on a given day in Slack.

Give us a small insight into your daily routine?

Start-ups are pretty wild and no day is the same. On any given day, I might be doing some combination of customer calls, managing payroll and bills, recruiting, working on a PRD or some other product management work, working on our website, or meeting with the research or engineering teams.

Over to you, Jessica

Why the Future of Audio Depends on the Parts of Your Song:

AudioShake sits at the intersection of music and tech, so one of the things we see pretty clearly is how much of our future music experiences will be built on the parts of a song (called “stems”). Music–and content more generally–will be “atomized,” creating entirely new ways for fans to interact with music and video.

Today, instrumentals and stems (e.g. the vocal or drum stem) are most often used in sync licensing, remixes, and remasters. Those will continue to be important revenue and creative opportunities for artists and labels going forward.

But we are also moving into a world where music will be all around us–not just in the ways we know today (e.g. sitting in our car or bedroom, listening to a track), but also via new formats and experiences that are beginning to emerge. Many of those new formats will be based on being able to pull apart and remix or re-arrange how, where, and when a song is heard.

For example, in social media, we already see artists releasing stems or audio that allows fans to interact and play with the song at a deeper level. This will only grow as social media platforms build tools to make it even easier for fans to remix and re-imagine songs–without needing to know how to use a professional digital audio workstation.

In gaming and fitness, we’ll see the emergence of adaptive music–music that changes in relation to your environment and isn’t hard-coded into the game or platform. Your avatar could enter a scene and the music might be different each time; not only that, the entire environment might incorporate elements of that song.

In music education, fans will be able to play along with their favorite original songs, isolating or muting, say, the guitar or bass. In VR and metaverse-type worlds, the audio will be separated into stems and then spatialized to create a more immersive experience. And already, people are working on things like stem NFTs.

Of course, there are still a lot of challenges for all of this to work for artists. 

For one, a ton of songs are missing their stems–sometimes because of when the song was recorded, or simply because a hard drive crashed and the session files were lost. In those cases, the music is largely limited to what can be done with the full mix. It’s particularly problematic in sync, where licensing teams have told us they miss 30-50% of the opportunities that come their way because they can’t provide an instrumental in time–if at all. That’s the piece AudioShake is hoping to help solve. We have an Enterprise platform for labels and publishers, and a platform for indie artists, so they can create AI instrumentals and stems on demand and not miss out on any opportunities.

Second, if we are going to move into a world where remixing, mash-ups, and algorithm-driven, interactive music experiences are the norm, then the right infrastructure has to be in place for artists to be paid, and for stem use to be detected across the ecosystem. Remixes can be great marketing and promotional tools, but artists aren’t always paid for them, and remixers don’t see much upside either. There is definitely a lot of room for improvement.

Having said all that, it’s exciting to think that there will be more opportunities for music and artists in the future. By opening recordings up into their parts and instrumentals, an artist’s music has more opportunities to be listened to and enjoyed in new ways.
