What is extended reality (XR)?
Extended reality (XR) is a term commonly used to describe all environments and interactions that combine real and virtual elements. Whilst XR usually encompasses AR (augmented reality), MR (mixed reality) and VR (virtual reality), it has a more specific meaning when used in relation to film, broadcast, and live entertainment production. In this article, we explain what that meaning is, and why XR is on course to become a studio staple.
Extended reality meaning
When used as an umbrella term, XR denotes all AR, MR and VR technologies – it’s the overarching label given to systems that integrate virtual and real worlds. In this sense, XR can be applied equally to certain motion capture techniques, augmented reality applications, or VR gaming.
In the production world, however, it means something much more specific. XR production refers to a workflow that combines LED screens, camera tracking systems, and powerful graphics engines. Often the LED wall operates alongside set extensions – tracked AR masks that allow the virtual scene to continue seamlessly beyond the edges of the physical screen.
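How that extension is blended with the camera feed varies from system to system, but the underlying idea is a per-pixel mask, driven by the camera tracking data, that marks where the physical wall ends and the rendered extension should take over. The snippet below is a simplified, hypothetical sketch in Python/NumPy – the function and parameter names are illustrative only, not any vendor's API:

```python
import numpy as np

def composite_set_extension(camera_frame: np.ndarray,
                            rendered_scene: np.ndarray,
                            wall_mask: np.ndarray) -> np.ndarray:
    """Blend a rendered set extension into the live camera frame.

    camera_frame   -- image from the studio camera, shape (H, W, 3)
    rendered_scene -- the same virtual scene rendered from the tracked
                      camera's point of view, shape (H, W, 3)
    wall_mask      -- tracked per-pixel weight in [0, 1]: 1 where the
                      physical LED wall (and foreground action) fills the
                      frame, 0 beyond the wall's edges, with a soft
                      falloff in the blend region
    """
    mask = wall_mask[..., None]  # broadcast the mask across the colour channels
    blended = camera_frame * mask + rendered_scene * (1.0 - mask)
    return blended.astype(camera_frame.dtype)

# Example with dummy data: the right-hand quarter of the frame falls outside
# the LED wall and is filled by the rendered scene instead.
h, w = 1080, 1920
camera_frame = np.full((h, w, 3), 128, dtype=np.uint8)
rendered_scene = np.full((h, w, 3), 64, dtype=np.uint8)
wall_mask = np.ones((h, w))
wall_mask[:, 3 * w // 4:] = 0.0
output = composite_set_extension(camera_frame, rendered_scene, wall_mask)
```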
How does XR production work?
In XR production, a pre-configured 3D virtual environment generated by the graphics engine is displayed on one or more high-quality LED screens, which form the background to live-action, real-world events. Combined with a precision camera tracking system, cameras can move in and around the virtual environment: as the real camera moves, the graphics engine re-renders the background from the matching viewpoint, so real and virtual elements stay seamlessly merged and locked together to create the combined illusion.
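In code terms, the heart of this workflow is a per-frame loop that reads the latest tracking sample, applies it to the graphics engine's virtual camera, and sends the newly rendered frame to the LED wall. The sketch below is a hypothetical outline in Python – the tracker, engine_camera, renderer, and led_wall objects are stand-ins for whatever tracking protocol and graphics engine a given stage actually uses:

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    """One tracking sample: position in metres, rotation in degrees, plus lens data."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float
    focal_length_mm: float

def update_virtual_camera(engine_camera, pose: CameraPose) -> None:
    """Copy the tracked pose onto the engine's virtual camera so the background
    rendered to the LED wall matches the real camera's viewpoint and lens."""
    engine_camera.set_position(pose.x, pose.y, pose.z)
    engine_camera.set_rotation(pose.pan, pose.tilt, pose.roll)
    engine_camera.set_focal_length(pose.focal_length_mm)

def render_loop(tracker, engine_camera, renderer, led_wall) -> None:
    """Per-frame loop: read the newest tracking sample, move the virtual camera,
    render the virtual environment, and display the result on the LED wall."""
    while True:
        pose = tracker.read_latest_pose()           # most recent sample from the tracking system
        update_virtual_camera(engine_camera, pose)  # lock the virtual camera to the real one
        frame = renderer.render()                   # virtual scene from that viewpoint
        led_wall.display(frame)                     # drive the LED processor with the new frame
```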
The benefits of XR production
Immersive production and real-time edits
Immersive technology enables actors, hosts, and producers to see the virtual environments whilst shooting. This means that they can adapt their performances or make edits live on set, which reduces time (and budget) spent in post-production.
Lighting
Lighting on an XR set is largely provided by the LED screens themselves. This common illumination helps real-world people and objects blend seamlessly into the virtual environment, and further reduces time spent on set adjusting lighting.
No colour spill or chroma key compositing
On green screen setups, colour spill and chroma key compositing can add significantly to the time spent in post-production. Neither is an issue with XR screens, which, again, reduces the time needed in post-production.
Rapid calibration
Calibrating the camera tracking systems on XR sets takes minutes rather than hours (as can be the case with green screen sets). This allows scenes to be shot across multiple sessions with minimal disruption and preparation. Where set extensions are used or AR objects are added, however, more precise calibration may still be required.
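What that calibration actually computes differs between tracking systems, but one common step is solving for the rigid transform that maps the tracker's coordinate frame onto the studio's, from a handful of surveyed reference points. The following is a simplified, hypothetical Python/NumPy sketch of that step using the Kabsch algorithm – the data and function names are illustrative only:

```python
import numpy as np

def align_frames(tracked_pts: np.ndarray, studio_pts: np.ndarray):
    """Find rotation R and translation t such that R @ tracked + t ~= studio.

    tracked_pts -- (N, 3) reference positions reported by the tracking system
    studio_pts  -- (N, 3) the same positions surveyed in studio coordinates
    """
    c_tracked = tracked_pts.mean(axis=0)   # centroid of each point set
    c_studio = studio_pts.mean(axis=0)
    H = (tracked_pts - c_tracked).T @ (studio_pts - c_studio)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_studio - R @ c_tracked
    return R, t

# Example: four reference markers with made-up coordinates (in metres),
# offset and rotated to simulate the mismatch between the two frames.
tracked = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],
                    [0.0, 0.0, 1.0]])
true_R = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
studio = tracked @ true_R.T + np.array([2.0, 3.0, 0.5])
R, t = align_frames(tracked, studio)
residual = np.abs(tracked @ R.T + t - studio).max()  # should be close to zero
```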
Examples of XR in production
The Mandalorian
Having experimented with XR sets in the making of The Lion King (2019), producer Jon Favreau used them to complete half of all scenes for his production of Disney’s The Mandalorian (2019). With a twenty-foot-tall LED wall that spanned 270°, Favreau’s team filmed scenes across a range of environments, from frozen planets to barren deserts and the insides of intergalactic spaceships. As well as giving the cast and crew a photo-realistic backdrop to work against, the XR set saved a significant amount of time in production. Rather than continually changing location and studio setup, the design team could rapidly switch out props and partial sets from inside the 75-foot diameter of the volume.
Dave at the BRIT Awards
At the 2020 BRIT Awards, an XR setup was used to enhance Dave’s performance of his single ‘Black’. In a collaboration between Mo-Sys, Notch, and Disguise, a 3D animation was projected onto Dave’s piano, giving live audiences around the world an engaging visual experience. With precise camera tracking provided by Mo-Sys StarTracker, camera operators could move freely around the stage without any drift or disturbance in the piano’s moving images.
HP OMEN Challenge 2019
With the rise of eSports, developers are exploring new ways to enhance gaming and bring immersive experiences to ever larger audiences. In 2019, HP did this by broadcasting their OMEN eSports tournament live from an XR stage. There were two main benefits of using extended reality: firstly, audiences around the world could immerse themselves in the virtual environments of the game; secondly, gamers in the studio could review their gameplay from ‘within’ the game. The end result was an interactive, immersive experience that blurred the line between the real and virtual worlds.
Mo-Sys Academy is committed to sharing the latest developments in film and broadcasting technology with those looking to enter the field. If you’re interested in learning more, check out our previous articles explaining what a film crew does, and the difference between AR and VR broadcasting.