ETC Pushes Boundaries of Virtual Production with ‘Fathead’

“A world without synthetic humans is not an option anymore,” the audience at the 2022 HPA Tech Retreat in Rancho Mirage was advised by “Fathead” virtual production producer Tom Thudiyanplackal. Equally unimaginable is a world without synthetic sets, as was obvious at Tuesday’s daylong HPA Super Session, themed “Immerse Yourself in Virtual Production.” “Fathead,” the fifth short from the USC Entertainment Technology Center’s annual ETC Innovation and Technology Grant, was all about in-camera VFX, captured in real time on a stage lined and lidded with LEDs — 280 degrees around and on the ceiling, too.

Since Disney+ began using the technique two years ago on “The Mandalorian,” the approach has been slowly becoming the norm, replacing the greenscreens and location shoots of yesteryear. While the transition to shooting on an LED stage, “the volume,” lends hyper-realism and certain conveniences, it is not without challenges. “The days of saying ‘We’ll fix it in post’ are over,” says ETC director of adaptive production Erik Weaver, who participated in several Super Session panels and helped program the day.

The project addressed questions of how to optimize workflow, essentially eliminating travel while minimizing the number of on-set participants. “Universal wanted to know, ‘How do we safely get production up and running in the era of COVID?’” Weaver explains. “You want to be able to test something like that out on something other than a $100 million movie or TV show. So we did a huge amount of research, investigating all the different protocols.”

“Fathead” is the ETC’s second virtual production, following 2020’s “Ripple Effect.” Shorter and less complex, “Ripple Effect,” Weaver notes, “was the only project that got through USC that year.” Indeed, the 12-week shoot was one of the few in a locked-down Hollywood. “Fathead,” by contrast, will be an eight-month project upon completion, Weaver estimates. Because final elements are more or less created in preproduction, the workflow requires special rigor and intensity of planning.

“It’s in [preproduction] that you see how the production design and everything will fit, so you have the whole model before you go on the actual volume and start burning money and time,” says “Fathead” director’s assistant Victoria Bousis, founder and creative director of virtual production firm UME (pronounced you-me).

Set in a dystopian future, “Fathead” depicts the clash of two young groups — the Dums and the Ragamuffins — battling for control of a junkyard paradise, where Dum siblings Fathead and Tudaloo face off against the menacing gang of Ragamuffins. The story is told by combining real and virtual elements.

Epic Games’ Unreal Engine real-time 3D creation software was the framework in which the project was brought to life. Bousis explains how pre-visualization involved “blocking each scene and dropping virtual cameras with desired lenses inside the game engine to define each shot.”
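What that previs pass looks like in practice can be sketched in a few lines of the Unreal Editor’s Python scripting. The snippet below is only an illustration of dropping virtual cameras with desired lenses into a level; the camera labels, positions and focal lengths are invented for the example, not taken from the “Fathead” project files.

```python
# Illustrative previs helper using Unreal's editor Python API (run inside the Unreal Editor).
# All names, positions and lens values below are hypothetical, not from the production.
import unreal

def drop_previs_camera(label, location, yaw_degrees, focal_length_mm):
    """Spawn a CineCameraActor at a blocked position and give it the planned lens."""
    rotation = unreal.Rotator()
    rotation.yaw = yaw_degrees
    camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
        unreal.CineCameraActor,
        unreal.Vector(location[0], location[1], location[2]),
        rotation,
    )
    camera.set_actor_label(label)
    # "Dropping a lens": set the focal length the shot is being designed around.
    cine_component = camera.get_cine_camera_component()
    cine_component.set_editor_property("current_focal_length", focal_length_mm)
    return camera

# Example blocking for one scene: a wide establishing angle and a tighter coverage angle.
drop_previs_camera("SC010_wide", (0.0, -600.0, 180.0), 90.0, 21.0)
drop_previs_camera("SC010_close", (120.0, -250.0, 160.0), 100.0, 50.0)
```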

In addition, Unreal’s MetaHuman Creator was used to generate the synthetic humans. The sibling duo, their mother and about six Ragamuffins were played by actors filmed on set, while about 30 Ragamuffins and three unique environments were virtual creations. “We used a combination of MetaHumans from Epic Games and volumetric scans of real actors in costume gathered at The Scan Truck,” Thudiyanplackal says.

The story unfolds in three distinct settings — the friendly grasslands created by UME; the Ragamuffin junkyard, created by the Rochester Institute of Technology (unique in offering a PhD in Computer Imaging); and the cave in which the climactic duel takes place, produced by three-time VFX Oscar nominee Scott Squires.

Happy Mushroom then applied a unifying polish to the end product, delivering what’s called “final pixel” to be displayed on the LED volume for the shoot. “You can do things with this technology that you can’t do with traditional visual effects unless you spend tens of millions of dollars,” Weaver says, explaining that volumetric production scales across budgets from $10,000 to $10 million. “Because you’re working so hard upfront to develop those assets, all the effects are created in-camera, rather than afterward.”

Director C. Craig Patterson points out that shooting in a junkyard, or its physical approximation, “would be unsafe and impractical, so the volume became our number one ally in telling the story. It gave us the ability to sculpt every inch of the world to suit the needs of ‘Fathead’ and its characters in ways that would have been financially exhaustive in traditional filmmaking.”

Bousis says a big challenge was “creating parallax,” or proper depth of perspective, by bridging the real and virtual worlds. This was done by “using some real things on the stage, staggering them to create the illusion of depth and continuing that illusion onto the wall.”

In some ways, it was not unlike the techniques used by realist landscape painters of the 19th century, or the makers of ’50s and ’60s epics like “Ben-Hur” and “Cleopatra” that integrated painted backdrops — though one difference is that the images on the LEDs move.

Another challenge: “We needed to build endlessly loopable animations that would smoothly transition into one another. The team at the stage had various buttons assigned with triggers to launch an animation [on the LED volume] in conjunction with the action on stage with the live actors,” Thudiyanplackal explains.
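One way to picture that trigger setup is as a small cue player that only swaps animations at loop boundaries, which is what keeps the hand-off seamless. The sketch below is plain Python and purely conceptual; the cue names, loop lengths and button mapping are invented for illustration rather than drawn from the show-control system actually used on the stage.

```python
# Conceptual sketch (plain Python, not the production's control software) of mapping
# operator buttons to endlessly loopable wall animations. A requested cue takes over
# only when the current loop wraps, so the transition lands on a seamless boundary.

# Hypothetical cue names and loop lengths in seconds.
LOOP_CUES = {
    "grasslands_idle": 8.0,
    "junkyard_wind": 12.0,
    "cave_duel_flicker": 6.0,
}

class WallCuePlayer:
    def __init__(self, cues):
        self.cues = cues
        self.current = None      # cue currently looping on the LED wall
        self.pending = None      # cue requested by the operator's button press
        self.loop_start = 0.0

    def press_button(self, cue_name):
        """Called when a stage operator hits the trigger button for a cue."""
        self.pending = cue_name

    def tick(self, now):
        """Advance playback; swap cues only when the current loop wraps around."""
        if self.current is None:
            if self.pending:
                self._start(self.pending, now)
        elif now - self.loop_start >= self.cues[self.current]:
            # Loop boundary reached: restart the loop or hand off to the pending cue.
            self._start(self.pending or self.current, now)

    def _start(self, cue_name, now):
        self.current, self.pending, self.loop_start = cue_name, None, now
        print(f"[{now:5.1f}s] looping animation: {cue_name}")

player = WallCuePlayer(LOOP_CUES)
player.press_button("grasslands_idle")
for t in range(30):
    if t == 10:
        player.press_button("junkyard_wind")  # operator cues the next look mid-loop
    player.tick(float(t))
```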

Tim Reha and Arvind Arumbakkam of Wacom developed the Wacom Brain Trust program to provide the “Fathead” team with Wacom Cintiq pen displays, premium 4K screens and the Pro Pen 3D. The goal was to enhance creative breakthroughs across the Unreal Engine virtual production pipeline, 3D scanning, MetaHumans, virtual fashion, and multi-touch displays combined with pressure-sensitive pen technology.

The LED volume itself was a combination solution comprising ROE’s Black Pearl LED displays and, to control them, Brompton Technology’s Tessera software. There are probably only about 50 stages in the world capable of doing anything remotely like this, perhaps a dozen of them high-end enough to make a Marvel movie, but more are coming online each year. The volume concept is just starting to take hold, Weaver says, noting “the technology is going to revolutionize production.”
