How augmented reality put five Madonnas on stage at once
Friday, 03 May 2019
The avatars (not holograms, so they weren't visible to the naked eye) wove in and out of the inventive performance, bursting into butterflies and puffs of smoke. Several environmental effects livened up the show, including digital rain, clouds, greenery and splashes of color, all of which blended with the physical staging to tell a cohesive story. Madonna has reinvented herself countless times over her storied career, so it's perhaps little surprise that she tried something like this.
Jamie King, Madonna's long-time creative director, said he was looking for something special for the BBMAs. 'After meeting with [Madonna's manager] Guy Oseary, we settled on the idea of incorporating augmented reality into the performance,' he told Engadget. 'I wanted to explore a way to involve her Madame X personas into the performance as well as the possibility of the real Madonna actually being able to perform with [them].'
The team brought the concept to a new creative AR company called Sequin, which took on the challenge of piecing the performance together. Not only was it the first time Madonna and Maluma had performed the song live, it also marked the first project for Sequin.
While you might not recognize the name, you'll probably be familiar with the work of co-founders Lawrence Jones and Robert DeFranco. At The Future Group, their projects included those dramatic flooding visualizations for The Weather Channel, an AR-enhanced performance by K/DA at last year's League of Legends World Championship Finals and effects for this year's Super Bowl, for which they were nominated for an Emmy.
Jones, who oversees creative, production and technology development at Sequin, believes it was the first time there has been a broadcast AR performance using volumetric capture, which he called 'the next revolution' of the medium. 'What's new about this is that it's a completely choreographed performance where Madonna and Maluma are dancing with four digital versions of Madonna in perfect choreography,' he told Engadget in an interview.
The show was something of a global affair. The volumetric capture process took place at a studio in London, while a Canadian company created the digital assets and environments in Unreal Engine for Sequin to pull together.
To read the rest of the article visit: uk.news.yahoo.com