The Future Audience Model: The Mechanics and Craft of a Multi-Platform Show

Building a successful entertainment experience today means going where consumers are. Thanks to advancing technology, coupled with the shift towards virtual experiences accelerated by the Covid-19 pandemic, today’s consumers demand a range of viewing experiences. In a US survey by Morning Consult, almost half of respondents (45%) said they were interested in the idea of seeing live music in the metaverse, while 38% of adults said they would be likely to attend a live sporting event there.

The interest from users is unquestionably there. However, anyone who aspires to put on a mixed reality event has to consider that only a fraction of the market owns virtual reality (VR) headsets, which sets different expectations across user groups. For example, the same feed isn’t viable both for someone watching a streamed performance on a smartphone and for someone watching through a VR headset. A curated, cinematic 2D video stream is very different from a 360-degree interactive view of the whole space that users can freely explore.
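That split can be sketched as a simple routing decision: each connecting device gets either a director-framed 2D cut or a viewer-controlled 360-degree feed. This is a minimal illustration only; the profile names and fields are hypothetical and do not describe any real production pipeline.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StreamProfile:
    """One variant of the show's output feed (illustrative, not a real API)."""
    name: str
    projection: str      # "flat" for a framed 2D cut, "equirect" for 360 video
    camera_control: str  # who moves the camera: "director" or "viewer"

# Hypothetical mapping from device type to feed variant.
PROFILES = {
    "smartphone": StreamProfile("cinematic-2d", "flat", "director"),
    "desktop":    StreamProfile("cinematic-2d", "flat", "director"),
    "vr-headset": StreamProfile("immersive-360", "equirect", "viewer"),
}

def select_profile(device: str) -> StreamProfile:
    """Pick the feed for a connecting device, defaulting to the 2D cut."""
    return PROFILES.get(device, PROFILES["smartphone"])
```

In this sketch, a VR headset is the only device handed camera control, which mirrors the distinction above: the 2D stream is curated shot by shot, while the 360-degree view leaves framing to the viewer.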

These are only some of the considerations creators increasingly need to weigh when developing content for people around the world to experience through their preferred devices. That’s especially true as entertainers step into the metaverse with virtual concerts and performances. Artists like Lil Nas X, Justin Bieber, Ariana Grande, Travis Scott, and deadmau5 are among those who have performed in virtual gaming worlds like Roblox and Fortnite, reaching millions of fans globally in real time. 

Not only can virtual concerts reach a larger audience, but they also unlock interactivity features so that fans truly feel like they’re part of the experience. What’s more, there’s plenty of room to grow and experiment with new kinds of immersive performances, including those that span multiple platforms. 

Carry Me Home: a well-executed experiment 

Creatives are always experimenting; innovation is at the core of any performer’s work. Often, we have to take a leap of faith, to risk failure — and that’s something we did both literally and figuratively with the world’s first live mixed reality virtual circus performance. Shocap Entertainment, in partnership with Animatrik Film Design, created and delivered a digitally enhanced acrobatic display for a live studio audience that could also be live streamed anywhere in the world or experienced through a VR headset.

Carry Me Home, the first entry in the LiViCi (Live Virtual Circus) performance series with the celebrated Montreal-based artist collective The 7 Fingers, is a blend of artistry and movement with an array of surreal digital enhancements. Artist Didier Stower wrote and performed original music, with lyrics interpreted in dance form and through digital art. The concept behind the show meant that songs were performed in multiple art forms on multiple platforms, both virtually and in real life. This is a great example of how the metaverse can enhance a real-world experience. Realizing this vision required advanced technology and the creative use of new virtual production techniques. We paired movement from some of the world’s most talented acrobats with motion capture, bringing them into the virtual space and augmenting everything with an array of dazzling visual effects.

Motion capture for real movement

Motion capture enabled every nuance of the skilled tumblers’ flips, rolls, and gestures to be represented in the digital space, maintaining the human form in a way that can’t be artificially duplicated. The real-time nature of the show meant that mocap was the linchpin tying the physical and digital worlds together: it allowed for the scene-stealing backdrops and effects brought by the virtual elements while also enabling audiences around the world to experience the immediacy of live performance.

Even simply streaming an online performance in real time through a flat feed would be a sizeable challenge. And we weren’t only running a live performance for an audience in our Vancouver studio, but also making that simultaneously available through video and immersive VR.

This multi-platform setup was underpinned by numerous practical considerations, from designing mocap suits that look like costumes, to identifying when and where to use active versus passive mocap markers. Building a workflow that accommodated our talent while also ensuring that we produced a complete digital experience through their performances required tailor-made solutions in all aspects of the show.

The audience matters more than ever

Studios need to consider the viewing experience from every perspective: in-person, through the livestream, and via VR headsets. Pick any two of those and there are some clear commonalities — but few that apply cleanly to all three.

The in-person viewer wants to see the live performers while still getting a feel for the digital augmentations. The livestreamed view focuses on the digital experience, with producers choosing angles and framing for the best shot. VR viewers, meanwhile, demand full immersion, with the ability to look anywhere they please.

As we started to bring the show together, we had to close some gaps between these approaches. For example, would our in-person audience stay enthralled when a shot is more digital than physical in nature? What does the VR viewer see while we’re transitioning between shots? How do we maintain the connection to the show’s emotional core throughout?

It took tireless experimentation and collaboration to overcome these hurdles and unite the different viewing experiences in a cohesive, groundbreaking multi-platform performance. The result is the first of many planned LiViCi shows that blend physical motion with virtual storytelling techniques.

As virtual and mixed reality shows grow in popularity, it will become increasingly important to keep the spirit of live performance and find a way to marry the digital and physical worlds in new and innovative ways. The two don’t have to be mutually exclusive, but it takes considerable effort to make them work as a cohesive and compelling show that can be viewed across platforms.

Crafting a captivating mixed reality experience is all about amplifying immersion while maintaining real connections between users, and extending that connection to the artists and performers. The audience model of the future is being shaped by these innovative experiences that push the boundaries of storytelling, creativity and technology.