Extended reality in the live environment

Dec 09, 2020

by Vincent Steenhoek, EVOKE Studios

Article first published by TVBEurope.

EVOKE Studios is a new venture that brings together established designers and engineers to extend what is possible on stage. With my colleagues Chema Menendez and Kristaps Liseks, we aim to deliver high-end design and workflows for performances and events – our goal is to create true visual chemistry on stage.

We work with some of the world’s leading creatives, on projects of every scale, up to the very largest.

AIM Awards

We were approached by DesignScene, the production company for the AIM Awards (Association of Independent Music). We were already working on extended reality environments for them, and they proposed that we might sponsor the Pioneer Award, using the performance as a showcase for what can be achieved. We were excited to support independent music as well as seizing the opportunity to highlight what we can do.

Little Simz won the Pioneer Award, and hers was just one of the performances we captured for the AIMs. While each performance was pre-recorded, each was shot as live with no fixing in post, including transitions done in camera: the take the performer liked was the one that was used.

So we had to get the extended reality elements right every time. That meant balancing calculated risks against creative exploration, to make sure we delivered something exciting but practical, to a high standard of quality.

XR is a moving target. The technology is still heavily in development. The current challenges are ensuring the smoothness of tracked content, working out frame delays, and achieving a close-to-faultless set extension – graphics rendered in camera that extend beyond the physical set.
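To illustrate the frame-delay problem: camera-tracking data and the video signal travel through different paths with different latencies, so a renderer typically buffers the tracking stream and pairs each video frame with the sample captured at the same moment. The sketch below is a hypothetical illustration of that idea, not EVOKE's or disguise's actual pipeline; the three-frame offset is an assumed value.

```python
from collections import deque

FRAME_DELAY = 3  # assumed offset, in frames, between tracking and video paths

class TrackingAligner:
    """Buffers timestamped camera-tracking samples so each incoming
    video frame can be matched with the sample captured alongside it."""

    def __init__(self, delay_frames):
        self.delay = delay_frames
        self.buffer = deque()  # (frame_number, sample) pairs, oldest first

    def push(self, frame_number, sample):
        self.buffer.append((frame_number, sample))

    def sample_for_video_frame(self, video_frame):
        # The matching tracking sample was captured `delay` frames
        # before this video frame reached the renderer.
        target = video_frame - self.delay
        while self.buffer and self.buffer[0][0] < target:
            self.buffer.popleft()  # discard stale samples
        if self.buffer and self.buffer[0][0] == target:
            return self.buffer[0][1]
        return None  # no matching sample buffered yet

aligner = TrackingAligner(FRAME_DELAY)
for f in range(10):
    aligner.push(f, {"pan_deg": f * 0.5})
print(aligner.sample_for_video_frame(8))  # sample captured at frame 5
```

Getting this offset wrong by even a single frame makes the set extension visibly "swim" against the physical LED wall on fast camera moves, which is why calibrating delays is one of the first steps in an XR setup.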

The physical construction itself may include virtual elements – LED panels for floors and walls, for example – to which the set extensions have to match precisely. Wherever the talent moves, the angles of shadows and the levels of brightness between the LED screens and the rigged lighting all need a lot of attention. The illusion is instantly broken by any mismatch between real and virtual.

For the AIM Awards, we created a “Honey, I Shrunk the Kids” look for AJ Tracey and MoStack (creative direction by TAWBOX), surrounding them with larger-than-life burgers and fries. We blended the floor LED, the virtual elements and the lighting to place the talent into the scenes as well as we could – even using old-school rotating gobos to blend Little Simz into the underwater worlds. I was really pleased with the way we put the presenter and the show idents into a 3D environment, and with the parallax effect we achieved within it.

Planning a project like the AIM Awards means we need to do considerable testing and plan with enough contingency. It also means that we need the supporting technology – like camera tracking – to just work. It has to be completely stable, no questions asked.

That is why we used StarTracker from Mo-Sys. It is a set-and-forget solution that uses dots on the ceiling which light up under ultraviolet light – hence the name StarTracker. Once the star pattern is captured, it is there forever.

We made the AIM videos at the new Deep Space studios at Creative Technology in Crawley. They have StarTracker set up on a jib, with all the connections to the lens and grip. What StarTracker gives us is a constant stream of extremely precise camera position in three-dimensional space, along with camera rotation and – from lens sensors – zoom and aperture (for depth of field). We used its translation, rotation and lens data inside the graphics servers (from disguise) to drive virtual cameras in Notch.
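Conceptually, each tracking sample is a small packet of position, rotation and lens values that the render engine applies to its virtual camera every frame. The sketch below shows that mapping with hypothetical field names – neither the packet layout nor the function names reflect Mo-Sys's actual protocol or the disguise/Notch API; the full-frame sensor width is also an assumption.

```python
import math
from dataclasses import dataclass

# Hypothetical tracking packet; field names are illustrative only.
@dataclass
class TrackingSample:
    x: float; y: float; z: float          # camera position (metres)
    pan: float; tilt: float; roll: float  # camera rotation (degrees)
    focal_length_mm: float                # from the lens encoder

SENSOR_WIDTH_MM = 36.0  # assumed full-frame sensor

def horizontal_fov(sample):
    """Convert the encoded focal length into the horizontal field of
    view (degrees) that a virtual camera expects."""
    return math.degrees(
        2 * math.atan(SENSOR_WIDTH_MM / (2 * sample.focal_length_mm)))

def virtual_camera_params(sample):
    # In a real pipeline these values would be pushed into the render
    # engine's camera each frame; here we just assemble them.
    return {
        "position": (sample.x, sample.y, sample.z),
        "rotation": (sample.pan, sample.tilt, sample.roll),
        "fov_deg": horizontal_fov(sample),
    }

s = TrackingSample(1.2, 1.8, -3.0, 15.0, -5.0, 0.0, 35.0)
cam = virtual_camera_params(s)
print(round(cam["fov_deg"], 1))  # a 35 mm lens on a 36 mm sensor ≈ 54.4°
```

The key point is that zoom and aperture come from the lens itself, not from image analysis, so the virtual camera's field of view and depth of field stay locked to the physical lens even mid-zoom.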

Our experience with StarTracker is that it gives us ultra-reliable, highly accurate positional data with no encoder drift to speak of, and ultra-low latency, so we can create virtual elements and match them live to the scene captured by the camera. As XR involves a lot of moving parts and separate calibrations, it helps a great deal when parts of the process are a constant.

For EVOKE, as a creative studio and systems integrator, companies like Mo-Sys enable us to do what we do. In turn, building on technologies like StarTracker enables awards shows like the AIMs to be presented in broadcast-quality virtual environments.
