Video: NAB Panel on ETC Film Project and Virtual Production

Last week, NAB and sponsor Grass Valley hosted a day of online sessions covering various “Production in a Pandemic” topics. ETC@USC’s Erik Weaver moderated a compelling panel during which industry experts discussed the methods used to produce the short film “Ripple Effect,” a live-action project testing the limits of virtual production. The production focused on how Previz, Techviz and Safetyviz can help limit the number of crew and cast on set to create a safer work environment. Video of the panel — “In Harm’s Way: Using Safetyviz to Mitigate Onset Liability” — is now available on the NAB Show site.

NAB Panel to Address Virtual Production of ETC Film Project

NAB Show, with sponsor Grass Valley, will offer online sessions September 2 that address topics related to “Production in a Pandemic.” In partnership with the Entertainment Technology Center @ USC, the day’s first session — “In Harm’s Way: Using Safetyviz to Mitigate Onset Liability” (10:00 am PT) — will feature a panel of industry experts discussing the methods used to produce the short film “Ripple Effect.” The project focused on how Previz, Techviz and Safetyviz can help limit the number of crew and cast on set to create a safer work environment.

Weta Digital Opens Virtual Production Service in New Zealand

Visual effects company Weta Digital — founded by Peter Jackson, Richard Taylor and Jamie Selkirk — joined forces with production facility Avalon Studios and live event, production and broadcast specialist Streamliner Productions to develop an LED-stage virtual production service based in Wellington, New Zealand. That country has done a good job of controlling COVID-19, making it an appealing destination for new TV and film productions. Similar to ILM’s StageCraft platform, Weta Digital’s system is based on Epic Games’ real-time Unreal Engine.

HPA Tech Retreat: The DPP’s Report on Next Gen Production

The Digital Production Partnership (DPP) is a London-based media industry business network, with a membership of 400+ companies representing the entire content supply chain. Eighteen months ago, the DPP issued the Production Business Survey report, detailing the operations of 57 production companies vis-à-vis the cloud. What it found was discouraging, so the DPP set its sights on another survey on Next Generation Production. At the HPA Tech Retreat, managing director Mark Harrison presented the findings.

HPA Tech Retreat: ETC Outlines Adaptive Production Projects

ETC’s director of adaptive production Seth Levenson described the USC think tank’s array of projects under this umbrella, which include archiving, blockchain, and visual effects standards. The working group on archiving, co-chaired by Paramount Pictures senior vice president of asset management Andrea Kalas, is developing best practices for cloud preservation. Levenson pointed to the white paper on “Guidelines for Digital Audio-Visual Assets in the Cloud,” which in part focused on fixity: the assurance that the assets retrieved from the cloud are identical, bit for bit, to the assets that were uploaded.
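
In practice, fixity is usually verified with checksums computed before an asset is uploaded and again after it is retrieved. Below is a minimal sketch of that comparison, assuming local file copies and SHA-256; the white paper does not prescribe a particular hash or tool, and the file paths shown are hypothetical.

import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 checksum by streaming the file in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_fixity(original: Path, retrieved: Path) -> bool:
    """True if the retrieved copy is bit-for-bit identical to the original."""
    return sha256_of(original) == sha256_of(retrieved)

# Hypothetical usage: compare a local master against a copy restored from cloud storage.
if __name__ == "__main__":
    ok = verify_fixity(Path("master/reel_01.mxf"), Path("restored/reel_01.mxf"))
    print("fixity verified" if ok else "fixity check FAILED")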

HPA Tech Retreat: Virtual Production for Mainstream Projects

Virtual production, used in big-budget movies such as “The Lion King” and “The Jungle Book,” relies on game engine technology to marry CGI backgrounds with live actors in real time. As such, it is a cutting-edge production technique. But, noted International Cinematographers Guild (ICG) advanced production technology specialist Michael Chambliss, virtual production can also be used on more mainstream productions with smaller budgets. He moderated a panel of industry experts with experience doing just that.

HPA Tech Retreat: The Latest Workflows for Virtual Production

The HPA Tech Retreat kicked off with an ambitious daylong demo that highlighted innovations in content creation, management and distribution technology and workflows. Supersession chair Joachim Zell, VP technology for EFILM, walked the audience through numerous elements of an HDR production: filming, editing and finishing two scenes that provided the final chapters for a short film. The process, much of which involved workflows in the cloud, featured multiple cameras, on-set management and collaboration platforms, editorial, dailies and digital intermediate color grading systems, as well as online mastering and distribution platforms.

‘The Mandalorian’ Uses Epic’s Unreal Engine for Production

“The Mandalorian,” one of the original exclusive shows on Disney+, follows a mysterious bounty hunter who takes on secretive jobs after the fall of the Empire. Recent data reveals that the show was the third most “in demand” digital original series since its debut. Epic Games is listed in the show’s end credits because series creator Jon Favreau used the company’s Unreal Engine in his production process. Unreal Engine is a popular platform for creating games such as “Fortnite,” but it is now being used in more Hollywood productions.

Real-Time Virtual Production Moves into Television and Film

Bringing real-time feedback to every step of the filmmaking workflow has long been both the Holy Grail of production and an impossible dream. When Bradley Weiers began to work in film production, he chafed at the delayed feedback and found that the real-time ecosystem of game production was a better fit. Now head of immersive storytelling at Unity Technologies, Weiers said that, for the first time, he believes he can connect his first love, film, with the tools he prefers from gaming. “There’s a bridge to cross over,” he suggested during a panel at NAB 2019.

Nvidia Quadro RTX Chips Offer AI and Real-Time Ray Tracing

Nvidia unveiled its new Turing architecture during a keynote at SIGGRAPH 2018, along with three new Quadro RTX workstation graphics cards aimed at professionals. Nvidia calls the Turing architecture its “greatest leap since the invention of the CUDA GPU in 2006.” The RTX chips are the first to use the company’s real-time ray tracing rendering, which results in more realistic imagery. Also at SIGGRAPH, Porsche showed off car designs accomplished with Epic Games’ Unreal Engine and Nvidia’s RTX chips.
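
Conceptually, ray tracing fires a ray from the camera through each pixel and tests what geometry it intersects, and the RT cores in these chips accelerate exactly that kind of intersection math. The following is a purely illustrative Python sketch of a single ray-sphere intersection test, not Nvidia’s implementation; all names and values are chosen here for illustration.

import math
from dataclasses import dataclass

@dataclass
class Sphere:
    center: tuple   # (x, y, z)
    radius: float

def ray_hits_sphere(origin, direction, sphere):
    """Return the distance along the ray to the nearest hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t (a quadratic in t).
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = sphere.center
    lx, ly, lz = ox - cx, oy - cy, oz - cz   # vector from sphere center to ray origin
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (lx * dx + ly * dy + lz * dz)
    c = lx * lx + ly * ly + lz * lz - sphere.radius ** 2
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                          # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None

# Trace one ray straight down the -z axis toward a sphere five units away.
print(ray_hits_sphere((0, 0, 0), (0, 0, -1), Sphere((0, 0, -5), 1.0)))  # -> 4.0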

Epic Games Demos Real-Time Effects for New Branded Short

In “The Human Race,” a short produced by visual effects house The Mill for Chevrolet, the automaker’s new 2017 Camaro ZL1 races a futuristic Chevy concept car, one driven by a racecar driver and the other by artificial intelligence. The short premiered at the Game Developers Conference to showcase how the car was created via real-time rendering with the help of Epic Games’ Unreal Engine. Epic Games CTO Kim Libreri demonstrated how aspects of the film could be changed in real time while it was playing.

SIGGRAPH 2015: Virtual Production, Cousin of Virtual Reality

At SIGGRAPH 2015, Autodesk executives David Morin and Ben Guthrie described virtual production, its relationship with virtual reality and some newly released tools from their company to aid in the process. Virtual production began with Peter Jackson’s “The Lord of the Rings,” got a bump of recognition with “Avatar,” and has been used on many films since. According to Morin and Guthrie, the process, which lets filmmakers create virtual worlds in-camera and composite CG and live action on set, is gaining momentum.