By Paula Parisi, September 5, 2023
Seattle-area startup Irreverent Labs has shifted its focus from blockchain-based video games and NFTs to artificial intelligence. Specifically, it wants to build foundation models for text-to-video generation and related content creation tools. Text-to-video generation is being explored by several companies but is still in development, and Samsung Next was intrigued enough by the proposition to invest an undisclosed sum in Irreverent. While several apps can output cartoonish results, more ambitious efforts remain limited: models that aim for photorealism, such as Meta’s Make-A-Video and Runway’s Gen-2, can output only four or five seconds of video at a time. Continue reading Samsung Next Invests in Irreverent Labs’ Text-to-Video Tech
By Debra Kaufman, March 23, 2021
In a conversation on cloud-enabled virtual production during the HPA Tech Retreat, Jack Wenzinger of Amazon Web Services’ Global M&E Partners vertical asked how those interested in virtual production can retool existing skills. ETC@USC senior consultant Erik Weaver stated that participating in Epic Games’ Unreal Fellowship program was “an eye-opening experience.” “Understanding what a blueprint is and how to put things in a timeline gave me a fundamental understanding,” he said. “I highly recommend watching all the videos you can on Unreal and start walking through the tutorials.” Continue reading HPA Tech Retreat: Cloud-Enabled Virtual Production – Part 2
By Debra Kaufman, March 22, 2021
Jack Wenzinger of Amazon Web Services’ Global M&E Partners vertical moderated a discussion during the HPA Tech Retreat on what’s been learned about cloud-enabled virtual production during the COVID-19 pandemic. He also noted that Epic Games has published an Unreal Engine field guide to virtual production that focuses on the work being done in pre-production and post. Joining the conversation were ETC@USC senior consultant Erik Weaver, Solstice Studios chief technology officer Edward Churchward and Mo-Sys Engineering technical director James Uren. Continue reading HPA Tech Retreat: Cloud-Enabled Virtual Production – Part 1
By Debra Kaufman, March 19, 2021
When COVID-19 hit, Hollywood (and other filmmaking venues) came to a near standstill, with movie theaters closed and productions halted. As DigitalFilm Tree chief executive Ramy Katrib noted, the M&E business is “uniquely unsuited to social distancing.” But Katrib decided to leverage Cinecode, the tools his company built for virtual production, to see if he couldn’t come up with a way to “visualize” safety on the set. At the Entertainment Technology Center@USC, senior consultant Erik Weaver worked with Katrib and beta-tested the result on the live-action short “Ripple Effect.” Continue reading HPA Tech Retreat: ETC’s ‘Ripple Effect’ Beta-Tests Safetyvis
By Debra Kaufman, November 22, 2019
“The Mandalorian,” one of the original exclusive shows on Disney+, follows a mysterious bounty hunter who takes on secretive jobs after the fall of the Empire. Recent data reveals that the series was the third most “in demand” digital original since its debut. Epic Games is listed in the show’s end credits because series creator Jon Favreau used the company’s Unreal Engine in his production process. Best known as the platform behind games such as “Fortnite,” Unreal Engine is now being used in more Hollywood productions. Continue reading ‘The Mandalorian’ Uses Epic’s Unreal Engine for Production
By Debra Kaufman, April 17, 2019
Disney Research and Rutgers University scientists just created an end-to-end model that uses artificial intelligence to produce storyboards and video directly from movie screenplay text. This kind of text-to-animation model is not new, but the research advances the state of the art by producing animations without annotated data or pre-training. The researchers wrote that the system is “capable of handling complex sentences” and is intended to make creatives’ work “more efficient and less tedious.” Continue reading Disney, Rutgers Scientists Use AI to Generate Storyboards
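To make the concept concrete, here is a minimal, hypothetical Python sketch of the text-to-animation idea: mapping a screenplay sentence to a rough list of animation “beats.” The verb lookup and function names are illustrative assumptions, not the researchers’ actual NLP pipeline.

```python
# Hypothetical verb-to-animation lookup (illustrative only, not from the paper).
ACTION_LIBRARY = {
    "walks": "walk_cycle",
    "sits": "sit_down",
    "opens": "open_prop",
    "looks": "head_turn",
}

def sentence_to_beats(sentence: str) -> list[dict]:
    """Turn one simple screenplay sentence into a list of {subject, action} beats."""
    words = sentence.lower().rstrip(".").split()
    subject = words[0] if words else "character"
    return [
        {"subject": subject, "action": ACTION_LIBRARY[w]}
        for w in words
        if w in ACTION_LIBRARY
    ]

if __name__ == "__main__":
    print(sentence_to_beats("John walks to the door and opens it."))
    # [{'subject': 'john', 'action': 'walk_cycle'},
    #  {'subject': 'john', 'action': 'open_prop'}]
```

A real system replaces the hard-coded lookup with learned language understanding; the point here is only the screenplay-text-in, action-list-out shape of the problem.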
By Debra Kaufman, February 19, 2019
Singularity Imaging founder/chief executive Eric Pohl discussed how drones and photogrammetry methods can be used to extract 3D information and create large point-cloud scenes. Uses include previsualization for production, content for set extensions, VR/AR and gaming applications. Pohl noted that, at last year’s HPA Tech Retreat, a presentation showed how the Unity game engine could be used to map and plan a production. “Mapping and remote sensing are quite mature, but drones bring something new to it,” he noted. Continue reading HPA Tech Retreat: Drones, Photogrammetry as Useful Tools
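As a rough illustration of the underlying photogrammetry math, the sketch below triangulates a single 3D point from two calibrated camera views with numpy; real drone pipelines repeat this for millions of matched features to build the point clouds Pohl described. The camera parameters are made up for the example.

```python
# Linear (DLT) triangulation: recover a 3D point from two calibrated views.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Recover a 3D point from two projection matrices and matching pixels."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                      # dehomogenize

# Toy setup: shared intrinsics K, second camera offset 2 m along X (the baseline).
K = np.array([[1000.0, 0, 640], [0, 1000.0, 360], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-2.0], [0.0], [0.0]])])

ground_truth = np.array([1.5, -0.3, 12.0, 1.0])        # a point about 12 m away
uv1 = (P1 @ ground_truth)[:2] / (P1 @ ground_truth)[2]  # its pixel in view 1
uv2 = (P2 @ ground_truth)[:2] / (P2 @ ground_truth)[2]  # its pixel in view 2

print(triangulate(P1, P2, uv1, uv2))   # ~ [1.5, -0.3, 12.0]
```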
By Debra Kaufman, April 12, 2018
Tools powered by artificial intelligence and machine learning can also be used in animation and visual effects. Nvidia senior solutions architect Rick Grandy noted that the benefit of such tools is that artists don’t have to replicate their own work. Examples include deep learning used to generate realistic character motion in real time via game engines, as well as a phase-functioned neural network for character control, which can be trained on motion capture or animation data. Continue reading NAB 2018: Artificial Intelligence Tools for Animation and VFX
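A stripped-down sketch of the phase-functioned idea appears below: the network’s weights are themselves a function of a cyclic phase variable, such as the character’s position in its gait cycle. The published PFNN blends four control weight sets with a cubic spline and is trained on motion capture; this toy numpy version uses linear blending and made-up layer sizes purely to show the mechanism.

```python
# Toy phase-functioned network: layer weights are interpolated from several
# control weight sets according to a cyclic phase in [0, 1).
import numpy as np

rng = np.random.default_rng(0)
K, IN, HID, OUT = 4, 32, 64, 24            # control points and layer sizes (illustrative)

W1 = rng.standard_normal((K, HID, IN)) * 0.1   # one hidden-layer weight set per control point
W2 = rng.standard_normal((K, OUT, HID)) * 0.1  # one output-layer weight set per control point

def blend(weights, phase):
    """Linearly interpolate neighboring weight sets around the cyclic phase."""
    p = (phase % 1.0) * K
    i0, frac = int(p) % K, p - int(p)
    i1 = (i0 + 1) % K
    return (1 - frac) * weights[i0] + frac * weights[i1]

def pfnn_step(x, phase):
    """One forward pass with phase-dependent weights."""
    w1, w2 = blend(W1, phase), blend(W2, phase)
    h = np.maximum(w1 @ x, 0.0)            # ReLU hidden layer
    return w2 @ h                           # e.g., next-frame pose update

pose_features = rng.standard_normal(IN)     # stand-in for mocap-derived input features
print(pfnn_step(pose_features, phase=0.37).shape)   # (24,)
```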
By Debra Kaufman, August 20, 2015
At SIGGRAPH 2015, Autodesk executives David Morin and Ben Guthrie described virtual production, its relationship with virtual reality and some newly released tools from their company to aid in the process. Virtual production began with Peter Jackson’s “Lord of the Rings,” got a bump of recognition with “Avatar,” and has been used on many films since. According to Morin and Guthrie, the process, which lets filmmakers create virtual worlds in-camera and composite CG and live action on set, is gaining momentum. Continue reading SIGGRAPH 2015: Virtual Production, Cousin of Virtual Reality
By Debra Kaufman, August 12, 2015
At SIGGRAPH 2015 in Los Angeles, Faceware Technologies, which creates markerless 3D facial motion capture solutions, demonstrated its Faceware Live plugin for Epic Games’ Unreal Engine 4. With the plugin, UE4 developers will be able to capture facial movements from any video source and apply them immediately to digital characters, with Unreal’s Persona animation system displaying the resulting facial animation in real time. The plugin was shown behind closed doors at SIGGRAPH. Continue reading SIGGRAPH: Faceware Unveils Live Capture for Gaming Engine
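The sketch below is a hypothetical illustration (not Faceware’s actual API) of what such a real-time pipeline conceptually does: for each video frame, a tracker emits expression coefficients that are applied as blendshape weights on a digital character, so the animation updates immediately.

```python
# Hypothetical real-time facial retargeting loop (stand-in tracker and rig).
import random
import time

EXPRESSIONS = ["jaw_open", "smile_left", "smile_right", "brow_up"]

def track_frame():
    """Stand-in for a video-based tracker; returns expression coefficients in [0, 1]."""
    return {name: random.random() for name in EXPRESSIONS}

class Character:
    """Minimal rig: just stores the current blendshape weights."""
    def __init__(self):
        self.blendshapes = {name: 0.0 for name in EXPRESSIONS}

    def apply(self, weights, smoothing=0.5):
        # Exponential smoothing keeps the retargeted motion from jittering.
        for name, value in weights.items():
            old = self.blendshapes[name]
            self.blendshapes[name] = old + smoothing * (value - old)

character = Character()
for _ in range(3):                 # pretend this is the engine's per-frame tick loop
    character.apply(track_frame())
    time.sleep(1 / 30)             # ~30 fps video source
print(character.blendshapes)
```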
By Debra Kaufman, August 11, 2015
At SIGGRAPH 2015, the computer graphics convention held this year in Los Angeles, virtual reality executives encouraged the Hollywood production community to start developing more content. Oculus Story Studio, the content creation division of Oculus, which makes the Rift virtual reality headset, recently debuted its second short VR film, “Henry.” Creating content for virtual reality comes with a variety of technical challenges, say the experts, but the medium won’t take hold until compelling content is available. Continue reading SIGGRAPH 2015: The Road to Producing Virtual Reality Content