Creatives and Directors: What Can You Achieve with Virtual Production?
What you can achieve with virtual production: Mo-Sys Engineering’s commercial director Mike Grieve on how virtual production can elevate creativity and save resources
In the virtual world, the possibilities are endless. Unlike a real set, where you’re limited by the physical attributes of set design, a virtual set is built in Unreal Engine, where you can be anywhere and have anything. Creatively, that frees you from budget, time and location limitations.
Stop Fixing in Post, Solve It in Pre
One of the key attributes of virtual production is the ability to pre-visualise everything before you even get to set. You can “walk” through your virtual scene and decide on the best camera angles, change lens types and adjust lighting. And because everyone from the director and producer to the cinematographer and VFX supervisor can look at the same 3D scene together from anywhere in the world, decisions can be made far more quickly and easily. So when you turn up on the day, all you need to do is light and shoot.
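One of those previs decisions, choosing a lens, comes down to simple geometry. As a quick illustrative sketch (plain trigonometry, no previs tool assumed; the Super 35 sensor width is just an example value), here is how focal length maps to horizontal field of view:

```python
# Horizontal field of view for a given focal length.
# Standard lens geometry; the sensor width is a typical Super 35
# figure (~24.9 mm) used purely as an example.
import math

def horizontal_fov_deg(focal_length_mm: float,
                       sensor_width_mm: float = 24.89) -> float:
    """FOV = 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

for focal in (18, 35, 50, 85):
    print(f"{focal:>3} mm lens -> {horizontal_fov_deg(focal):5.1f} deg horizontal FOV")
```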
You don't get that level of foresight on a physical shoot. Virtual production swaps basic preparation and fixing things in post for high-level prep that solves things in pre-production.
Not only that, but now that talent can actually see the virtual set around them – using an LED volume – rather than imagining where to look and interact as they would with a green screen, you can shoot far more accurately. This helps avoid errors in things like eyelines between talent and virtual elements.
When you look at the whole production process, from pre-production to the final deliverable, virtual production shrinks overall production time and cost by reducing post-production needs. The bottom line: it's better to solve problems in pre than to try to fix them in post.
Production company: Made Brave in partnership with Quite Brilliant and Arts Alliance at Garden Studios
Shoot In-Camera Effects in Real-Time
The quality of the 3D scenes created for virtual production shoots is always very high. But when a scene is loaded into the computer stack running Unreal Engine and camera tracking is attached, more often than not it doesn’t play back in real time, because the scene can’t be processed fast enough.
When this happens, the scene needs to be ‘optimised’, which is a bit like video compression shrinking a file size: when the processing load goes down, the frame rate comes up, allowing the scene to play back in real time so the real-time VFX shoot can happen.
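The trade-off is simple arithmetic. Here is a minimal sketch with purely illustrative numbers: at 24 fps the renderer has roughly 41.7 ms to produce each frame, and any scene that takes longer than that per frame cannot hold real time.

```python
# Illustrative frame-budget arithmetic for real-time playback.
# All numbers are hypothetical, not measurements from any shoot.

def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame at a given frame rate."""
    return 1000.0 / fps

def holds_real_time(render_time_ms: float, fps: float) -> bool:
    """A scene only plays back in real time if every frame fits the budget."""
    return render_time_ms <= frame_budget_ms(fps)

shoot_fps = 24
scene_render_ms = 55.0  # hypothetical: a heavy, unoptimised scene
print(f"Budget at {shoot_fps} fps: {frame_budget_ms(shoot_fps):.1f} ms")  # 41.7 ms
print("Real-time?", holds_real_time(scene_render_ms, shoot_fps))          # False
# 55 ms > 41.7 ms, so this scene must be optimised until the
# per-frame cost fits back inside the budget.
```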
The problem is that the quality level of the Unreal scene is then fixed: if you try to add any more quality, the frame rate drops below real time and you can't shoot in-camera effects. This is a well-known problem.
What normally happens is that a director or producer then has to decide which shots will go to post-production for compositing, to raise the quality of the background. That takes time and money. More than that, it goes against the whole principle of virtual production, which aims to cut compositing time as much as possible.
At Mo-Sys, we’ve patented a solution to this called NearTime. It’s a service that runs in parallel with a real-time VFX LED shoot, automatically re-rendering the virtual background scene at higher quality so it can be composited back together with the keyed talent, letting you deliver a much higher-quality product within the same delivery window.
So as soon as you roll the camera on the first shot, all of the tracking and lens data from the camera is sent up to the cloud, where the same Unreal scene you’re shooting exists on 50 to 100 servers. Then all the quality dials are wound up, and each take is re-rendered sequentially as the real-time shoot goes on. This lets you deliver higher-resolution background graphics, faster and automatically, saving money and time.
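To make the shape of that workflow concrete, here is a rough sketch, with the caveat that every name and structure below is a hypothetical illustration of such a pipeline, not the actual NearTime implementation or any Mo-Sys API: record per-frame tracking and lens data for each take, then fan the takes out to many render workers in parallel.

```python
# Hypothetical sketch of a NearTime-style pipeline: record per-frame
# tracking and lens data during each live take, then re-render the
# takes at higher quality in parallel. All names and structures are
# illustrative assumptions, not the actual Mo-Sys implementation.
from dataclasses import dataclass, field
from concurrent.futures import ThreadPoolExecutor

@dataclass
class FrameSample:
    timecode: str
    position: tuple           # camera position (x, y, z)
    rotation: tuple           # camera rotation (pan, tilt, roll)
    focal_length_mm: float
    focus_distance_m: float

@dataclass
class Take:
    name: str
    frames: list = field(default_factory=list)

def rerender_take(take: Take) -> str:
    # Stand-in for a cloud render job: replay the recorded camera path
    # through the same Unreal scene with all the quality dials wound up.
    return f"{take.name}: {len(take.frames)} frames re-rendered at full quality"

takes = [
    Take("take_01", [FrameSample("00:00:00:01", (0.0, 0.0, 1.5), (0.0, 0.0, 0.0), 35.0, 3.0)]),
    Take("take_02", [FrameSample("00:01:10:01", (1.0, 0.0, 1.5), (5.0, -2.0, 0.0), 50.0, 2.5)]),
]

# Fan the takes out across many workers, standing in for the
# 50 to 100 cloud servers holding copies of the scene.
with ThreadPoolExecutor(max_workers=8) as pool:
    for result in pool.map(rerender_take, takes):
        print(result)
```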
Production company: Made Brave in partnership with Quite Brilliant and Arts Alliance at Garden Studios
Embrace Change and Dive In
As virtual production is still fairly new to most creatives and directors, there is an element of getting used to new ways of working. Lighting, for example, is handled differently on a virtual set. When you've got real talent lit by hard and soft lighting, and an LED wall with different lighting characteristics displaying the background scene, everything needs to match so that, from the camera's perspective, it all looks like part of the same set. Fortunately, on-set colour grading is about to get a boost, which will be music to the ears of cinematographers who have already shot in an LED volume.
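One simplified way to picture that matching step (a toy sketch of the underlying idea, not how any specific volume or grading tool works) is as a colour transform: measure how the camera sees test patches on the wall, fit a correction, and pre-correct the wall content so the camera ends up seeing the intended colours.

```python
# Simplified illustration of matching LED-wall colour to the camera's
# view: fit a 3x3 correction matrix from measured test patches using
# least squares. Real on-set grading pipelines are far more involved
# (LUTs, per-panel calibration); this just shows the core idea.
import numpy as np

# Hypothetical measurements: what we sent to the wall vs what the
# camera actually recorded for a handful of test patches (linear RGB).
sent = np.array([[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0],
                 [0.5, 0.5, 0.5]])
seen = np.array([[0.92, 0.03, 0.01],
                 [0.04, 0.88, 0.05],
                 [0.02, 0.06, 0.90],
                 [0.47, 0.49, 0.46]])

# Solve seen ~= sent @ M for M, then pre-correct wall content with the
# inverse so the camera ends up seeing the intended colours.
M, *_ = np.linalg.lstsq(sent, seen, rcond=None)
correction = np.linalg.inv(M)

intended = np.array([0.6, 0.4, 0.2])
wall_output = intended @ correction  # what to actually display
print("Display on wall:", np.round(wall_output, 3))
print("Camera will see ~", np.round(wall_output @ M, 3))
```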
At the moment, the biggest challenge lies in the quality of the Unreal scene. On a virtual set, there are two types of video you can display on the LED wall. The first is video plate playback, used for things like car scenes where the vehicle is moving quickly down a street: the car is static in the virtual studio but the video is moving. Those plates are very high quality because they are shot with multiple high-quality cameras on a rig designed to capture a rolling 360-degree view.
But then you have the Unreal scene using virtual graphics. This is where you need camera tracking on the real camera to match it to the virtual scene displayed on the wall. The quality of these virtual graphics is very good, but it’s not quite as good as post-production compositing just yet. This is where our NearTime technology can help.
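The tracking itself is conceptually simple: every frame, read the real camera's pose and lens state and copy it onto the virtual camera rendering the wall. The sketch below uses hypothetical stand-in objects for illustration, not Mo-Sys StarTracker or Unreal Engine APIs.

```python
# Hypothetical sketch of the per-frame loop that keeps a virtual
# camera locked to the real one. The objects here are stand-ins,
# not real Mo-Sys or Unreal Engine APIs.
from dataclasses import dataclass

@dataclass
class CameraPose:
    x: float          # position in metres
    y: float
    z: float
    pan: float        # rotation in degrees
    tilt: float
    roll: float

@dataclass
class LensState:
    focal_length_mm: float
    focus_distance_m: float

def apply_tracking(virtual_cam: dict, pose: CameraPose, lens: LensState) -> None:
    """Copy the real camera's pose and lens state onto the virtual
    camera so the perspective rendered on the wall matches the move."""
    virtual_cam["transform"] = (pose.x, pose.y, pose.z, pose.pan, pose.tilt, pose.roll)
    virtual_cam["focal_length_mm"] = lens.focal_length_mm
    virtual_cam["focus_distance_m"] = lens.focus_distance_m

# One frame's worth of tracking data, applied to a stand-in camera.
virtual_cam: dict = {}
apply_tracking(virtual_cam, CameraPose(1.2, 0.0, 1.5, 12.0, -3.0, 0.0), LensState(35.0, 2.8))
print(virtual_cam)
```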
And finally, there’s the challenge of continuity when changing elements or editing Unreal scenes live on set. Imagine you're on a virtual set and decide you want to move one of the objects on the LED volume to the other side of the screen. When you change something, you need to log what you've changed, because it always has a downstream impact on the shoot, and it causes issues if you then have to remember which other scenes need updating as a result. This is something Mo-Sys is working on solving very soon, with technology that allows on-set real-time editing of Unreal scenes and automatically captures and logs the revisions. Watch this space!
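In the meantime, even a lightweight change log captures the idea. The sketch below is an assumed structure for illustration only, not the forthcoming Mo-Sys tool: record each live edit with the asset, the change and the scene, then look up which other scenes share that asset.

```python
# Rough sketch of an on-set revision log for live Unreal scene edits.
# The structure is an assumption for illustration, not Mo-Sys's tool.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Revision:
    asset: str        # e.g. a hypothetical "SM_Lamppost_03"
    change: str       # human-readable description of the edit
    scene: str        # the scene edited live on set
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

class RevisionLog:
    def __init__(self):
        self.revisions: list[Revision] = []
        # Assumed mapping of which scenes share which assets.
        self.asset_usage: dict[str, list[str]] = {}

    def record(self, rev: Revision) -> list[str]:
        """Log the edit and return other scenes that may need updating."""
        self.revisions.append(rev)
        return [s for s in self.asset_usage.get(rev.asset, []) if s != rev.scene]

log = RevisionLog()
log.asset_usage["SM_Lamppost_03"] = ["Scene_Street_Day", "Scene_Street_Night"]
affected = log.record(Revision("SM_Lamppost_03",
                               "moved to screen-left of the LED volume",
                               "Scene_Street_Day"))
print("Also update:", affected)
```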