Panelists Evaluate Challenges, Benefits of Virtual Production
February 6, 2023
Virtual production in its current form has been around for only about two years, but those years have felt like light years. From novice to veteran, everyone is still learning the ins and outs, the difference between previs and techvis, 3D and 2.5D. The pandemic fast-forwarded “to about six months a development process that used to happen on a four-year cycle, so a lot of buzzwords and concepts went from zero to 100 percent very quickly,” Lux Machina president Zach Alexander shared at the first annual SVG Silicon Valley Video Summit (SVVS), which brought together more than 400 digital practitioners from numerous areas.
The one-day meet at the Computer History Museum in Mountain View included the session “Virtual Production in the Real World,” led by ETC@USC’s head of virtual & adaptive production Erik Weaver.
“ICVFX, or in-camera visual effects, is getting a lot of heat now because instead of doing shots on green screen and putting the effects in later, we’re able to do them live in the camera, and that’s got people really excited,” is how American Cinematographer virtual production editor Noah Kadner framed the discussion for an audience that included professionals ranging from filmmakers to football shooters and corporate videographers.
While virtual production takes place on a stage where the walls are LED displays against which the live action is shot, it can be helpful not to think of it as a protracted special effect. “We approach virtual production shoots as if they’re location shoots, as opposed to a mentality where you’re going to fix it in post,” said Voluminous Studios CTO Jay Spriggs. “Whatever you can do ahead of time to prepare for the actual shoot day in the virtual world, you’ll find a lot of savings.”
The process comes with some very specific jargon. “The two extremes are kind of easy to visualize: the 2D is projection, a plate, the content being delivered is going to be a still or a video of some sort,” Alexander explained, adding that “3D is what you see in ‘The Mandalorian,’ where it’s a full [game] engine immersive environment. Your point of view from the camera is essentially you’re a player on the level. You can walk around the back of everything.”
“Whether the asset’s been finished is another discussion, but that’s the concept,” he said. “Two-and-a-half-D is this kind of interesting blend between the two, where you can almost think of it as theater flats, or scrims, where you have different scrim lines and you can bring stuff in. By pairing camera tracking with those 2D plates, you can sort of create a parallax movement as the camera moves.”
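In practical terms, the parallax Alexander describes falls out of simple pinhole geometry: a plate’s on-screen shift is proportional to the camera’s sideways move and inversely proportional to the plate’s assumed depth. Below is a minimal Python sketch of that idea; the layer names, depths, and focal length are illustrative assumptions, not values from any real volume.

```python
# Minimal sketch of 2.5D parallax: each flat "plate" sits at an
# assumed depth, and a tracked camera translation shifts near plates
# more than far ones (offset is inversely proportional to depth).

FOCAL_LENGTH_PX = 1500.0  # assumed focal length, in pixels

# Hypothetical scrim-line layers: name -> depth from camera (meters)
LAYERS = {"foreground_rocks": 3.0, "midground_trees": 12.0, "sky_plate": 200.0}

def parallax_offsets(camera_shift_m: float) -> dict:
    """Horizontal pixel offset for each plate given a sideways camera move."""
    return {
        name: FOCAL_LENGTH_PX * camera_shift_m / depth
        for name, depth in LAYERS.items()
    }

if __name__ == "__main__":
    # Camera moves 0.5 m to the right; near layers sweep, far layers barely move.
    for name, px in parallax_offsets(0.5).items():
        print(f"{name}: shift {px:.1f} px")
```

Doubling a layer’s depth halves its shift, which is why a distant sky plate can stay nearly static while foreground scrims sweep past the lens.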
Previs is planning, while techvis is “looking at physical characteristics of cameras and dollies and sets and stages, and seeing how they would physically interact,” Kadner said. Stuntvis uses game engines to simulate action shots, from car chases to explosions. Postvis happens after production wraps, “looking at ways we can add more virtual elements into completed shots.”
He added that “simulcam — looking through the camera, the viewfinder, the monitor, we superimpose virtual elements so you can see the dragon that’s not really there in ‘Game of Thrones,’ or the Na’vi who are only half there [on set] in ‘Avatar.’”
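At its core, that superimposition is a standard alpha “over” composite of a rendered element onto the live camera frame. The Python sketch below shows just that blending step, under the assumption that the virtual element arrives as an RGB image plus an alpha matte; real simulcam rigs on those productions add camera tracking, lens matching, and latency handling that are omitted here.

```python
import numpy as np

def over_composite(cg_rgb: np.ndarray, cg_alpha: np.ndarray,
                   camera_frame: np.ndarray) -> np.ndarray:
    """Standard 'over' operation: blend a CG element onto the live
    camera frame per pixel, weighted by the element's alpha matte."""
    alpha = cg_alpha[..., np.newaxis]  # broadcast alpha across RGB channels
    return (alpha * cg_rgb + (1.0 - alpha) * camera_frame).astype(camera_frame.dtype)

# Hypothetical frames: a live camera image and a rendered dragon layer
frame = np.random.rand(1080, 1920, 3).astype(np.float32)
dragon_rgb = np.zeros_like(frame)                       # black stand-in element
dragon_alpha = np.zeros((1080, 1920), dtype=np.float32)
dragon_alpha[400:700, 800:1200] = 1.0                   # region the dragon occupies

monitor_feed = over_composite(dragon_rgb, dragon_alpha, frame)
```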
Laney College XR Lab founder and Unreal visualization artist Koina Freeman is teaching more than the next generation. “I just spent the past two-and-a-half months on a set with an A-list director prevising a film. He didn’t know anything about the engine, the Unreal Engine, about virtual production.” The technology is so new that the learning curve is crowded at every level.
Weaver mentioned that ETC is soon to release a white paper on the short film “Fathead,” the first virtual production to shoot on Amazon’s new volume stage in Culver City, and has already published a 150-page paper on the virtual production short “Ripple Effect.”