Kristin Baver, writing at StarWars.com, takes a closer look at the innovative in-camera visual effects technology developed by Jon Favreau and ILM for Disney+ series The Mandalorian.
When The Mandalorian opens on a frigid ice planet, the titular bounty hunter zeroing in on his next prize, you can almost feel the gusts of wind howling across the unforgiving terrain. The effect is visceral: a seamless marriage of tried-and-true practical sets and costuming with an innovative new technique that brings visual effects to the forefront of the production process.
Behind the scenes on the set of The Mandalorian, the storytellers and visual effects engineers creating the first live-action Star Wars television series collaborated to crack the code for what has become a game-changing creation: StageCraft, a technological marvel that immerses the cast and production crew inside their CG environments in real time with the help of a massive wraparound LED screen.
“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of real-time, in-camera rendering,” showrunner Jon Favreau has said.
“Jon Favreau found the breakthrough that George [Lucas] was always looking for when he first explored the idea of a live-action TV show,” adds Richard Bluff, the visual effects supervisor on the acclaimed series. With StageCraft, the innovators at Industrial Light & Magic, Lucasfilm, and their collaborative partners on the project have achieved the planet-hopping magic of Star Wars storytelling, transporting viewers to the galaxy far, far away by setting The Mandalorian in a largely computer-generated, photo-real environment that wraps around physical sets and real actors to create a seamless effect.
It’s staggering to watch the final effect unfold, as a team of artists and engineers known as the Brain Bar acts as mission control just a few yards away from the Volume, a curved cocoon of glowing LED screens ready to transport those standing inside it literally anywhere. “It’s exactly the same sort of technology as the large LED screens you see in Times Square,” Bluff says. “What we wanted to do was shoot on a small stage with small physical sets that could be wheeled in and out fairly quickly and then extend those physical sets on the wrap-around LED wall.” And by moving visual effects to the beginning of the filming process, StageCraft enriches the actors’ performances and gives directors and cinematographers more precise storytelling tools in a completely fabricated galaxy.
It was just about six months before filming began that showrunner Jon Favreau, executive producer Dave Filoni, and DP Greig Fraser joined forces with ILM, Epic Games (maker of the Unreal Engine), and production technology partners Golem Creations, Fuse, Lux Machina, Profile Studios, NVIDIA, and ARRI to unlock this innovative achievement. While ILM had pioneered virtual production tools and had worked successfully with LED technology on previous Star Wars films, StageCraft was still very much in its infancy at the time, a virtual reality platform that helped storytellers scout fabricated environments to set up their shots. Propelled by the support of Lucasfilm president Kathleen Kennedy and the sheer will of Favreau, who was always pushing the collaborative team to try new things and bringing together some of the brilliant minds capable of making it happen, the crew took their next steps into the larger world by crafting the prototype of the Volume.
Fresh off projects like The Jungle Book and The Lion King, Favreau was passionate about employing new technology to enrich storytelling as he began his work on The Mandalorian. But the scheduling constraints of a television show, and a Star Wars show at that, one that had to satisfy the planet-hopping scope fans have come to expect while feeling entirely authentic and accessible, meant whatever the team came up with had to look realistic and be shootable on a Los Angeles soundstage without the traditional challenges of location shooting. “One of the things we wanted to do is move away from green screens and make the scale of a Star Wars TV show work,” Bluff says. “And we knew that we needed a technological innovation to push the boundaries and provide a solve for the production. Through the collaboration with Jon Favreau, Greig Fraser, ILM, Epic Games, and others we landed on the idea of utilizing video wall technology.”
Read the article in full at the link below.