Live Performance 2.0
By Alex Cohen from Berklee College of Music's Music Business Journal.
Although technology may have gutted the recorded music market, it has played a significant role in enhancing live performance. Recent developments deserve the attention of both artists and concert producers.
4D Sound is an installation in Amsterdam whose sound system surrounds the audience to a degree not seen before. Sixteen pillars of speakers ring the room, with more speakers in the walls and floor. Sound immersion is the least of it. The stage setup is complemented by software that gives the performer complete control over the placement of sounds, allowing the artist to arrange audio in space rather than merely adjusting volume and timbre. Listeners at 4D Sound have reported that “sounds [fly] past their heads and [rain] from the sky” and “guitar riffs [come up] from their feet”.1 When performers can manipulate the sonic spectrum at will, and pinpoint parts of that spectrum selectively to audiences, live concerts will have changed forever.
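The article does not describe 4D Sound's actual algorithm, but the basic idea of placing a virtual sound among an array of speakers can be illustrated with a simple distance-based amplitude panner. The function name, room layout, and weighting scheme below are all hypothetical, a minimal sketch of the concept rather than the installation's real engine:

```python
import math

def speaker_gains(source, speakers):
    """Distance-based amplitude panning: speakers nearer to the
    virtual source position get more level; gains are normalized
    so they sum to 1."""
    # Inverse-distance weights (small floor avoids division by zero
    # when the source sits exactly on a speaker).
    weights = [1.0 / (math.dist(source, pos) + 1e-6) for pos in speakers]
    total = sum(weights)
    return [w / total for w in weights]

# Four hypothetical speaker pillars at the corners of a 10 m x 10 m room.
pillars = [(0, 0), (10, 0), (0, 10), (10, 10)]

# A virtual sound hovering near the front-left pillar: that pillar
# receives the largest share of the signal.
gains = speaker_gains((1, 1), pillars)
```

Sweeping the source coordinates over time is what makes a riff seem to travel across the room; a real system like 4D Sound would use far more sophisticated spatialization, but the principle of per-speaker gain control is the same.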
Club promoters and music industry executives like cutting costs. Top musical acts downsize their orchestras, and a single DJ who can sell out an arena is cheaper than a rock band. Technologists have recently taken this a step further.
Holographic performances take out the human element altogether. The artist is virtual: designed graphically, filmed, if necessary, with an actor in his or her place, and merged with footage of the original artist; the right post-production effects make the concertgoer’s experience believable. The first time this technology went mainstream was at Coachella 2012. During the set of Snoop Dogg and Dr. Dre, deceased rapper Tupac Shakur ‘made an appearance’ as a hologram and performed alongside his living colleagues. The digital Tupac looked bizarrely realistic and even interacted with the crowd and other performers.
Since then there have been more such performances. At the 2014 Billboard Music Awards, Michael Jackson, in hologram form, performed “Slave to the Rhythm” accompanied by real backup dancers and musicians. Many of the attendees were dear friends of Jackson’s and appeared to find the performance unsettling (the estate of the deceased performer, of course, should have a say in the use of a dead artist’s likeness).
Holographic performances have not been restricted to resurrecting the dead. Hatsune Miku is the world’s first holographic pop star. She is not the recreation of a former artist, and her voice is digitally synthesized by an application created by Crypton Future Media. Although she is best known in Japan, and is marketed to Japanese youth, she has found global acclaim. On October 8th, 2014, Hatsune Miku made her debut on the Late Show with David Letterman. The virtual artist performed her J-pop song “Sharing the World” with a touring band of humans. Although these holographic performances have not been without their critics, the amount of attention they have received so far suggests that the market for virtual artists will likely grow.
Music production platforms have started to incorporate performance features into their software. Ableton Live did this most successfully and became one of the most widely used DAWs on the market. Unlike other music production software, such as Logic or Pro Tools, its sound engine is designed to continue playback without overloading and crashing a computer’s central processing unit. Ableton also integrates other performance products in a user-friendly way. With Akai, it has created a number of hardware controllers that work with the software. Ableton allows users to map virtually any parameter in the program and change it at will from a MIDI controller, personalizing both the controllers and the hardware setup. Other products, such as touchAble and Lemur, are apps that further extend Ableton’s software.
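The mapping idea at the heart of this workflow, binding a knob on a MIDI controller to an arbitrary software parameter, can be sketched in a few lines. This is not Ableton's code; the class name, the CC number, and the cutoff parameter below are hypothetical, chosen only to show how a "MIDI learn" dispatcher scales 7-bit controller values onto a parameter's range:

```python
def scale(value, lo, hi):
    """Map a 7-bit MIDI value (0-127) onto a parameter range."""
    return lo + (hi - lo) * value / 127

class MidiMap:
    """Minimal MIDI-learn dispatcher: bind a CC number to any
    parameter setter, then feed it incoming (cc, value) pairs."""
    def __init__(self):
        self.bindings = {}

    def bind(self, cc, setter, lo=0.0, hi=1.0):
        self.bindings[cc] = (setter, lo, hi)

    def handle(self, cc, value):
        if cc in self.bindings:
            setter, lo, hi = self.bindings[cc]
            setter(scale(value, lo, hi))

# Hypothetical usage: CC 20 drives a filter cutoff from 20 Hz to 20 kHz.
state = {}
m = MidiMap()
m.bind(20, lambda v: state.update(cutoff=v), lo=20.0, hi=20000.0)
m.handle(20, 127)   # knob fully clockwise -> cutoff = 20000.0 Hz
```

A real setup would receive the (cc, value) pairs from a hardware port, but the dispatch-and-scale pattern is what lets a performer repurpose the same knob for any parameter on stage.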
iPads & iPhones, Gloves & Bracelets
Today, many artists perform on stage using iPads and, in some cases, even iPhones. Both allow real-time control of instruments and effects in a live setting, so a performer can replicate, or modify at will, their songs on stage. New controllers are making past restrictions disappear fast. Imogen Heap waves gloves to play melodies and change timbres in concert. She is happy that she no longer needs to reach for faders, knobs, or keyboards, and that a wave of her hand or a particular bend of a finger gives her the desired effect.2
Companies are taking notice of such developments. Lightwave is pioneering wearable technology that reads movement, audio levels, and body temperature. The Lightwave bracelet, created by Rana June, one of the first iPad DJs, is focused on reading concertgoers’ physical responses and converting them into emotional metrics. Lightwave partnered with Pepsi, Fast Company, and Fool’s Gold artist A-Trak at SXSW 2014. A-Trak made adjustments to crowd analytics on the fly, and so did the light crew, the fog machine operator, and the confetti blasters. The song set was even revised to suit Lightwave’s data. Moreover, a leaderboard showing the most engaged fan in the audience was projected on a screen behind A-Trak. Finally, a “boys vs. girls” dance-off unlocked refreshments when a performer reached a given body temperature.3
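Lightwave has not published how it converts sensor readings into "emotional metrics," but the leaderboard idea can be illustrated with a toy scoring function. Everything here, the weights, the inputs, and the fan data, is invented for illustration:

```python
def engagement(motion, temp, resting_temp=36.5):
    """Toy engagement score: a weighted mix of motion intensity
    (0-1, e.g. normalized accelerometer magnitude) and body
    temperature rise above resting, in degrees Celsius."""
    return 0.7 * motion + 0.3 * max(0.0, temp - resting_temp)

# Hypothetical per-fan readings: (motion, body temperature).
fans = {
    "fan_a": (0.9, 37.4),   # dancing hard, warmed up
    "fan_b": (0.4, 36.6),   # swaying near the back
}

scores = {name: engagement(m, t) for name, (m, t) in fans.items()}
top_fan = max(scores, key=scores.get)   # the name on the leaderboard
```

A production system would stream readings continuously and smooth them over time; the point is simply that once physical responses become numbers, ranking fans, triggering effects, or reordering a set list are all straightforward comparisons on top of those numbers.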
3. Lightwave “Pepsi Bioreactive Concert @ SXSW 2014” http://vimeo.com/lightwave