Unreal powers on-the-fly compositing with handheld camera
By Ken Pimentel
At NAB 2018 in Las Vegas, AMD partnered with ARwall to showcase a new use for real-time rendering in virtual production: an augmented reality wall that responds to changes in camera perspective, thus eliminating the need for compositing green screen footage with rendered CG in post-production. The ARwall Effects (ARFX) System, running on AMD hardware, was on display for all NAB participants to demo.
With the ARFX System, images rendered by Unreal Engine are displayed on a screen large enough to serve as a scene’s backdrop, and they update in real time in response to the camera’s movement. When live actors stand in front of the wall, the result is an instantly “composited” shot: a real-time 3D virtual set extension.
“This is akin to what you’d do with a green screen, but now there’s no green screen,” says Frank Vitz, Director of the Immersive Technology Team at AMD. “The camera is free to move around. You see your composite happening right there in the camera.”
With the ARwall system, the artifacts that can affect a green screen shoot are no longer a concern. “You don’t have green spill. You don’t have problems with hair,” says Vitz.
The ARFX System is an evolution of rear-projection techniques, in which a filmmaker projects previously recorded footage of the environment on a screen behind the actors during live filming. However, such techniques only hold up when the camera faces the screen head-on; shooting from any other angle immediately breaks the illusion.
With the ARFX System, a handheld camera can shoot at any angle and roam anywhere in the virtual set for a fully immersive filmmaking experience. The perspective of near and far elements in the CG environment responds as expected in real time, with none of the limitations of a 2D projection. In addition, actors can now see and react to the environment, helping to enhance their performances.
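Holding that illusion at arbitrary angles comes down to an off-axis (asymmetric-frustum) projection: each frame, the projection is recomputed from the tracked camera’s position relative to the physical screen. Below is a minimal sketch of the standard generalized perspective projection formulation for a planar screen; the vector type and function are illustrative, not ARwall’s actual code.

```cpp
#include <cmath>

// Minimal 3D vector with just the operations the projection needs.
struct Vec3 {
    float x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    float dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
    Vec3 cross(const Vec3& o) const {
        return {y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x};
    }
    Vec3 normalized() const {
        float len = std::sqrt(x * x + y * y + z * z);
        return {x / len, y / len, z / len};
    }
};

// Off-axis projection for a planar screen (Kooima-style formulation).
// pa = screen's lower-left corner, pb = lower-right, pc = upper-left,
// pe = tracked camera (eye) position, n/f = near/far clip distances.
// Writes a column-major, OpenGL-style 4x4 projection matrix into m.
void OffAxisProjection(const Vec3& pa, const Vec3& pb, const Vec3& pc,
                       const Vec3& pe, float n, float f, float m[16]) {
    Vec3 vr = (pb - pa).normalized();      // screen right axis
    Vec3 vu = (pc - pa).normalized();      // screen up axis
    Vec3 vn = vr.cross(vu).normalized();   // screen normal

    Vec3 va = pa - pe, vb = pb - pe, vc = pc - pe;  // eye-to-corner vectors
    float d = -va.dot(vn);                 // eye-to-screen distance

    // Asymmetric frustum extents at the near plane.
    float l = vr.dot(va) * n / d;
    float r = vr.dot(vb) * n / d;
    float b = vu.dot(va) * n / d;
    float t = vu.dot(vc) * n / d;

    // Standard frustum matrix built from those extents.
    float M[16] = {
        2 * n / (r - l), 0, 0, 0,
        0, 2 * n / (t - b), 0, 0,
        (r + l) / (r - l), (t + b) / (t - b), -(f + n) / (f - n), -1,
        0, 0, -2 * f * n / (f - n), 0
    };
    for (int i = 0; i < 16; ++i) m[i] = M[i];
}
```

A full implementation also folds the screen’s orientation and the eye position into the view matrix, but the asymmetric frustum above is what keeps near and far elements parallax-correct as the camera roams.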
“It's kind of the Holy Grail of virtual production to be able to make a seamless composite on the fly,” says Vitz. “This is amazing new technology that adds another tool to the toolkit for virtual production.”
To generate real-time updates to the background, ARFX uses Unreal Engine in conjunction with a set of trackers attached to the camera. The system can also employ AMD’s Advanced Media Framework (AMF) and an IRT sensor to feed lighting information to Unreal Engine: a Zcam 360 camera captures the lighting on the practical set, which Unreal Engine uses to light the virtual set extension to match and to respond to any lighting changes in real time. The system shown at NAB was powered by an AMD Ryzen Threadripper CPU and two Radeon Pro WX 9100 GPUs.
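On the engine side, the tracking loop is conceptually simple: each frame, read the physical camera’s pose from the trackers and apply it to the in-engine camera so the backdrop renders from the matching viewpoint. Here is a minimal Unreal Engine sketch under stated assumptions: the TrackerSDK::GetPose stub stands in for whatever the real tracking hardware exposes, the tracker is assumed to report position in meters, and a bare AActor keeps the sketch short where a real setup would drive a CineCameraActor.

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "TrackedCameraActor.generated.h"

// Placeholder standing in for a real tracker SDK; replace with the
// actual hardware read. Returning a fixed pose keeps the sketch compilable.
namespace TrackerSDK
{
    inline bool GetPose(FVector& OutPosition, FQuat& OutOrientation)
    {
        OutPosition = FVector::ZeroVector;
        OutOrientation = FQuat::Identity;
        return true;
    }
}

UCLASS()
class ATrackedCameraActor : public AActor
{
    GENERATED_BODY()

public:
    ATrackedCameraActor()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        FVector Position;   // tracker-space position, assumed meters
        FQuat Orientation;  // tracker-space orientation
        if (TrackerSDK::GetPose(Position, Orientation))
        {
            // Convert meters to Unreal units (centimeters) and mirror the
            // physical camera's pose on the in-engine camera every frame.
            SetActorLocationAndRotation(Position * 100.0f,
                                        Orientation.Rotator());
        }
    }
};
```

In practice, the same tracked pose also drives the off-axis projection above, so the wall’s image and the camera’s framing stay locked together.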
"Filmmakers are the most demanding creators when it comes to realism and fidelity,” says Rene Amador, CEO of ARwall. “We knew that in order to make ARwall successful, we would need an engine that could handle whatever we threw at it, as well as the rigor and demands of a production environment. Unreal Engine has met these needs and more. Every update that shows up is like Christmas morning for the team."
Want to try out on-the-fly compositing and a whole lot more? Download Unreal Engine today.