Music Business

How Google Music Works: Somebody Tell Jimmy Iovine They Don’t Just Rely On Algorithms!

As it turns out, even Google, possibly the company most associated with removing humans from decision-making systems, is using human curation in its development of the Google Play Music All Access streaming subscription service. Insiders recently explained how Google combines algorithmic decision making and human insight to power its music experience.

It's a good thing nobody really cared that Jimmy Iovine misrepresented objective reality in his early hype tour for Daisy, the digital music flower that has yet to bloom. Not only was he wrong then, but he becomes ever more wrong as new services emerge that are finding their own combinations of machine and human curation.

Google, though generally associated with algorithmic solutions for ranking search results, actually has a human element. Google says it's used sparingly for things like child porn and spam, while SEO professionals have long pointed to results that suggest otherwise. In any case, the company isn't new to human intervention; it's just cheaper to let the machines sort things out if you can.

John Paul Titlow spoke with Google researcher Douglas Eck, who heads up the machine learning aspect of All Access, and Tim Quirk, who heads the editorial side.

Google's Music Machines

Titlow describes the role of computers:

"Eck's team is focused on the technical side of this equation, relying on a dual-sided machine learning methodology. One component of that is collaborative filtering of the variety employed by Netflix and Amazon to recommend horror flicks and toasters. The other involves machine listening. That is, computers 'listen' to the audio and try to pick out specific qualities and details within each song."

Collaborative filtering, which suggests music based on what large numbers of listeners choose to play, is augmented by the user's personal listening history, which ultimately leads to better recommendations.
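To make the idea concrete, here's a minimal sketch of item-based collaborative filtering: songs are scored for a listener by how similar their play-count patterns are to the songs that listener already plays. The toy matrix, function names, and numbers are invented purely for illustration; this is not Google's actual system.

```python
import numpy as np

# Toy play-count matrix: rows are listeners, columns are songs.
# A real service would use a huge, sparse matrix of implicit feedback.
play_counts = np.array([
    [5, 3, 0, 1],   # listener A
    [4, 0, 0, 1],   # listener B
    [1, 1, 0, 5],   # listener C
    [0, 1, 5, 4],   # listener D
], dtype=float)

def song_similarities(matrix):
    """Cosine similarity between the columns (songs) of a play-count matrix."""
    norms = np.linalg.norm(matrix, axis=0, keepdims=True)
    norms[norms == 0] = 1.0            # avoid dividing by zero for unplayed songs
    normalized = matrix / norms
    return normalized.T @ normalized   # songs-by-songs similarity matrix

def recommend(listener_plays, similarity, top_n=2):
    """Score every song by its similarity to what the listener already plays."""
    scores = similarity @ listener_plays
    scores[listener_plays > 0] = -np.inf   # don't re-recommend songs they already know
    return np.argsort(scores)[::-1][:top_n]

similarity = song_similarities(play_counts)
# Recommend songs for listener B, who has only played songs 0 and 3.
print(recommend(play_counts[1], similarity))
```

Folding in a user's own history, as the article describes, would simply mean weighting that listener's play counts more heavily than the crowd's when computing the scores.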

Machine listening is roughly analogous to Pandora's human analysis of music, which produces descriptions of musical elements that let Pandora pick a track based on similarities between songs. The difference is that Pandora's human approach allows for an understanding of context and subtler distinctions than math and machines can currently handle.
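For a rough sense of what the machine-listening side might compute, here's a small sketch using the open-source librosa library. The article doesn't name Google's tools, so librosa is just a stand-in, and the file path is a placeholder. It pulls out tempo, spectral brightness, and timbre descriptors and folds them into a single feature vector that could be compared across songs.

```python
import librosa
import numpy as np

AUDIO_PATH = "example_track.mp3"   # placeholder: any local audio file

# Load the audio as a mono waveform at librosa's default sample rate.
waveform, sample_rate = librosa.load(AUDIO_PATH)

# A few of the low-level descriptors machine-listening systems typically extract:
tempo, _ = librosa.beat.beat_track(y=waveform, sr=sample_rate)            # estimated BPM
centroid = librosa.feature.spectral_centroid(y=waveform, sr=sample_rate)  # "brightness" over time
mfccs = librosa.feature.mfcc(y=waveform, sr=sample_rate, n_mfcc=13)       # timbre summary

# Collapse the frame-by-frame features into one fixed-length vector per song,
# which can then be compared across a catalog to find similar-sounding tracks.
feature_vector = np.concatenate([
    np.atleast_1d(tempo),
    [centroid.mean()],
    mfccs.mean(axis=1),
])
print(feature_vector)
```

What this kind of analysis can't do, as the article notes, is capture context: a human can hear that two songs share a mood or a cultural moment even when their measurable features differ.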

Google's Human Curators

Obviously Google doesn't want to have humans try to describe every song. Currently it involves humans, led by Tim Quirk, in tasks such as choosing the top tracks in a genre and creating curated playlists. But it isn't using that information to directly shape the algorithms:

"To the extent that these manually curated parts of the service influence its users' listening behavior, the human intelligence does find its way back into the algorithms. It just loops back around and takes a longer road to get there."

Titlow also describes his experience of listening to Pandora versus Google's Radio mode and finds that Pandora comes out ahead. For now, at least, the human factor seems to be winning the day. No wonder Jimmy Iovine tried to claim "first."

[Thumbnail image of "Georgia Tech's Golem Chesster" courtesy Jiuguang Wang.]

Hypebot Senior Contributor Clyde Smith (@fluxresearch/@crowdfundingm) also blogs at Flux Research and Crowdfunding For Musicians. To suggest topics for Hypebot, contact: clyde(at)fluxresearch(dot)com.

