How ‘The Mandalorian’ Uses ‘Fortnite’s’ Game Engine for Filmmaking

There were a lot of cameos to catch during Disney+’s Mandalorian premiere last week. Prisoners frozen in carbonite, Salacious Crumb’s siblings on a spit-roast, an IG assassin droid, and a brief background shot of a secondary Mandalorian figure with armor almost identical to Boba Fett’s. But eagle-eyed production nerds may have noted a surprising real-world cameo in the episode’s credits: Fortnite developer Epic Games.

Fortnite is the new killer app among Generation Z, adding a competitive edge to the building mechanics they learned in Minecraft. But Jon Favreau’s team isn’t crediting Epic because of on-set battle royales. Instead, The Mandalorian makes use of Epic’s other cash cow: Unreal Engine 4, its widely used game development engine.

Unreal Engine 4, or UE4 for short, powers not just Fortnite and other Epic properties like the fittingly named Unreal Tournament, but many high-budget “AAA” games across the industry. These include stylish games like Kingdom Hearts 3 and Dragon Ball FighterZ, and even Star Wars games like the recently released Star Wars Jedi: Fallen Order. And during a surprise on-stage appearance at the Unreal Engine User Group at the SIGGRAPH 2019 computer-graphics conference this past July, Mandalorian executive producer Jon Favreau explained how he’s using the popular game development tool for filmmaking.

Let’s start with how UE4 helps streamline existing filmmaking practices. “We used the V-cam system where we get to make a movie, essentially in VR, send those dailies to the editor,” explained Favreau, “and we have a cut of the film that serves a purpose that previs would have.”

There are two big takeaways from this quote. First is the “V-cam system,” which refers to a built-in tool for Unreal Engine called the “virtual camera plugin.” This system lets users take the mock environments they’ve built with Unreal Engine 4’s more vanilla features, render them in real time, and create customized virtual cameras that move through the 3D space of the scene and record it live.


This plays into our second takeaway: previs. Previs is filmmaking lingo for “previsualization,” and refers to any tool a director or cinematographer might use to plan a scene’s visuals before shooting. Usually, this involves painstakingly hand-sketched storyboards: essentially, a mock comic book of the episode, drawn so the team can plan out certain shots ahead of time.

Going on set without storyboards can result in a haphazard, improvised look, not necessarily appropriate for launching a major new streaming TV service. By using Unreal Engine 4 to make CG mockup movies of the show, Favreau is able to give his team the equivalent of storyboards without needing to go through the arduous process of having them hand-drawn.

The team still needs to make 3D models to use in the engine, but because much of the final product is CG anyway, being able to use one library for both tasks reduces the scope of the project to a more manageable level. Additionally, Unreal comes built-in with basic shapes, free downloadable props, and even a landscape editor that is often enough for many projects’ previs purposes.

But Unreal’s work in The Mandalorian goes beyond replacing storyboards. Because so much of the show relies on CG backgrounds, Favreau was able to use the game engine to provide much-needed context for his live actors.

This comes through the use of LED walls, which Favreau originally intended to use only as interactive lighting or as dynamic moving greenscreens. However, when he realized that Unreal Engine could render the show’s environments in real time and display them on the walls, he saw an opportunity to situate his actors in an alien setting as opposed to a dry studio.

“For the actors, it was great because you could walk on the set, and even if it’s just for interactive light, you are walking into an environment where you see what’s around you,” Favreau stated. “And it would fool people. I had people come by the set from the studio who said, ‘I thought you weren’t building this whole set here,’ and I said, ‘No, all that’s there is the desk.’”


This makes for a significant step forward from the awkward behind-the-scenes footage of the prequels, which frequently features actors in front of a blank blue or green tarp, and nothing else.

In fact, the LED walls were so convincing that, aside from helping the actors, their real-time in-camera renders often also worked for the final shot.

“We got a tremendous percentage of shots that actually worked in-camera, just with the real-time renders in engine, that I didn’t think Epic was going to be capable of,” Favreau said.

Most CG filmmaking uses pre-rendered footage, which can’t be manipulated after creation and can take up to seven hours to render a single frame. Being able to use footage rendered in real time allows for more flexibility in camera work (and in the visualizations presented to actors on the LED walls), and it significantly speeds up the process.
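To put that seven-hour figure in perspective, here is a quick back-of-envelope calculation. The numbers are illustrative only: the worst-case per-frame time cited above, and film’s standard 24 frames per second.

```python
# Illustrative back-of-envelope math: how long would offline rendering
# take for a shot, at the article's worst-case rate of 7 hours per frame?
FPS = 24                      # standard film frame rate
HOURS_PER_FRAME_OFFLINE = 7   # worst-case offline render time cited above

def offline_render_hours(shot_seconds: float) -> float:
    """Total offline render time, in hours, for a shot of the given length."""
    return shot_seconds * FPS * HOURS_PER_FRAME_OFFLINE

# A single 10-second shot:
hours = offline_render_hours(10)
print(f"{hours} hours (~{hours / 24:.0f} days) to pre-render a 10-second shot")
```

In other words, a shot that plays in ten seconds could tie up a render farm for weeks if every frame hit that worst case, while the in-engine version appears on the LED wall instantly.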

One of my biggest concerns going into The Mandalorian was whether it could live up to Star Wars’ special effects pedigree on a TV budget. But after seeing sequences like Episode 2’s sandcrawler fight, I’m more than convinced. As a critic, I’ve scrutinized games made with Unreal Engine 4, and as a game developer, I myself have worked with the engine’s myriad tools before. I’m more than impressed by its ability to deliver anything from AAA bombast to smaller, more subdued poetic contemplation.

We live in a time where the boundaries between art forms are crumbling. Game design programs frequently teach architecture and film lighting to make their spaces more believable and thematic. For film to borrow elements from game design is just keeping in the Star Wars tradition: using every tool available to set a new industry standard.
