Sony Pictures Masters Classic Films in High Dynamic Range

At AMIA’s The Reel Thing conference in Hollywood, Sony Pictures Entertainment senior vice president of technology for production and post production Bill Baggelaar presented a session on HDR video mastering for classic cinema. He first hoped to dispel myths about high dynamic range. “I’ve heard that you need sunglasses to watch HDR, that filmmakers will hate it and that it will be too hard to deliver,” he said. “People also worry that there are too many formats, with HDR10, Dolby Vision, HDR10+ and HLG.”
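Of the formats Baggelaar named, HDR10 and Dolby Vision both build on the SMPTE ST 2084 perceptual quantizer (PQ) transfer function. A minimal sketch of the PQ EOTF, which maps encoded code values to absolute display luminance (illustrative, not from the session):

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 PQ EOTF: map a normalized code value in [0, 1]
    to absolute display luminance in cd/m^2 (nits)."""
    # Constants defined by ST 2084
    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32     # 18.8515625
    c3 = 2392 / 4096 * 32     # 18.6875
    e = signal ** (1 / m2)
    y = (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)
    return 10000.0 * y
```

Code value 1.0 maps to the 10,000-nit peak the curve is defined up to, while SDR reference white (about 100 nits) sits near code value 0.51 — one reason classic material remastered in HDR leaves so much headroom for highlights.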

NAB 2017: ETC Panel Tells the Producer’s Perspective on VR

The final panel at ETC’s conference on VR/AR convened producers who have worked on virtual reality projects. John Canning, vice president of the Producers Guild of America’s New Media Council, moderated the discussion with producers from ETC@USC, StoryTech Immersive, Digital-Reign and The Virtual Reality Company. StoryTech Immersive president/chief storyteller Brian Seth Hurst spoke about his experiences creating “My Brother’s Keeper,” a 360-degree spin-off of PBS’s “Mercy Street.” “We were able to get close and intimate with our actors,” he said.

Epic Games Demos Real-Time Effects for New Branded Short

In “The Human Race,” a short produced by visual effects house The Mill for Chevrolet, the automaker’s new 2017 Camaro ZL1 races a futuristic Chevy concept car, one driven by a racecar driver and the other by artificial intelligence. The short premiered at the Game Developers Conference to showcase how the car was created via real-time rendering with the help of the Unreal game engine. Unreal maker Epic Games CTO Kim Libreri demonstrated how aspects of the movie could be changed in real time while it played.

RadicalMedia and Uncorporeal Develop Hologram Experience

RadicalMedia has been working on a project to present “great people” as holograms in venues optimized for augmented reality. Although much of the project is under wraps, more became clear recently when RadicalMedia partnered with Uncorporeal, a volumetric capture startup developing technology to create human holograms that can be used in VR or AR content. Headed by Sebastian Marino, formerly visual effects supervisor on “Avatar,” Uncorporeal’s eight staffers are veterans of Lucasfilm, Weta Digital and Electronic Arts.

Researchers Develop Efficient Way to Render Shiny Surfaces

Computer scientists at UC San Diego have developed an efficient technique for rendering the sparkling, shiny and uneven surfaces of water, various metals and materials such as injection-molded plastic finishes. The team has created an algorithm that improves how CG software reproduces the interaction between light and different surfaces (known as “glints”), a technique the team claims is 100 times faster than current state-of-the-art methods, requires minimal computational resources, and works for animation as well as still images.
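At its core, glint rendering asks how many microscopic facets inside a pixel's footprint reflect light toward the camera. A brute-force sketch of that counting problem (purely illustrative — the UCSD method replaces this kind of exhaustive loop with a far faster closed-form/hierarchical evaluation):

```python
import math
import random

def glint_fraction(num_facets, half_vector, threshold_deg, seed=0):
    """Estimate the fraction of random microfacet normals (upper
    hemisphere) aligned with the half-vector to within threshold_deg.
    A naive stand-in for counting glinting facets in a pixel footprint."""
    rng = random.Random(seed)
    cos_thresh = math.cos(math.radians(threshold_deg))
    hits = 0
    for _ in range(num_facets):
        # Random unit normal on the upper hemisphere (z >= 0)
        z = rng.random()
        phi = rng.random() * 2.0 * math.pi
        r = math.sqrt(max(0.0, 1.0 - z * z))
        n = (r * math.cos(phi), r * math.sin(phi), z)
        dot = sum(a * b for a, b in zip(n, half_vector))
        if dot >= cos_thresh:
            hits += 1
    return hits / num_facets
```

Looping over millions of facets per pixel per frame is exactly what makes naive glint rendering impractical, which is why a 100x speedup with low memory cost matters for production use.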

ILMxLAB Debuts ‘Tatooine’ VR, Develops Darth Vader Projects

Darth Vader is the star of an upcoming Lucasfilm virtual reality project centered on “Star Wars.” Although the project is largely undefined at this point — it has no name, genre, or release date — what we do know is that the story will both reveal new details about Darth Vader’s background and try out some innovative storytelling techniques. Lucasfilm’s ILMxLAB, which draws on award-winning VFX facility Industrial Light & Magic, Skywalker Sound and the “Star Wars” story group, is developing the project.

CryWorks: Disney, Pixar, ILM Vets Launch New VR Company

VFX and CGI veterans Euan Macdonald, Hans Uhlig and Kymber Lim have secured funding led by Michael Bay’s 451 Media Group, 500 Mobile Collective, and WI Harper Group to launch an immersive entertainment company called CryWorks, with plans to produce virtual and augmented reality experiences. “Although there are a few high-quality VR content pieces to date, most of them have little incentive for the viewer to keep tuning back in,” said Macdonald. “We see an opportunity to build the first VR broadcast network, partnering with other production companies and creating addictive, episodic experiences.”

NAB 2016: Sphericam and Liquid Cinema Look to Advance VR

Two companies at last week’s NAB Show, Sphericam and Liquid Cinema, are making interesting contributions to the advancement of VR cinema. Sphericam is preparing to launch a 6-sensor, 4-microphone spherical camera the size of a baseball into the prosumer market. The camera can internally stitch at 30 fps and, with an attached PC, output 60 fps live video. Liquid Cinema has developed a comprehensive yet simple-to-use software package for editing VR footage, adding effects, and, most interestingly, re-establishing the director’s intent for where viewers should look at cut points within the video.
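Re-establishing directorial intent at a cut typically means rotating the incoming 360-degree clip so the director's chosen forward direction lands wherever the viewer happens to be looking. A minimal sketch of that yaw re-centering (an assumption about the general technique, not Liquid Cinema's actual implementation):

```python
def cut_reorient_yaw(viewer_yaw_deg: float, director_yaw_deg: float) -> float:
    """At a cut, compute the yaw rotation (degrees) to apply to the
    incoming clip so the director's intended forward direction
    (director_yaw_deg) appears at the viewer's current gaze
    (viewer_yaw_deg). Result is wrapped to (-180, 180]."""
    offset = (viewer_yaw_deg - director_yaw_deg) % 360.0
    if offset > 180.0:
        offset -= 360.0
    return offset
```

For example, if the viewer is gazing at 90 degrees and the director framed the action at 0 degrees, the new clip is rotated by 90 degrees so the action starts in view; applying the smallest signed rotation avoids a jarring wrap-around.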

Cloud Conference: Moving From Local to Cloud Infrastructure

ConductorIO VP of business development and operations Monique Bradshaw talked about the paradigm change from local, on-premise infrastructure to the cloud. “The paradigm shift means a fundamental change in approach or underlying assumptions,” she said during an ETC Cloud Innovation Conference keynote at NAB. “We’re seeing a big change in the ways that companies are looking at their rendering.” She noted that 90 percent of survey respondents expect to have at least some of their rendering in the cloud within five years, up from close to 60 percent today.

Cloud Conference: Challenges to Rendering VFX in the Cloud

Visual effects and rendering in the cloud was the topic of an ETC Cloud Innovation Conference panel at NAB 2016, moderated by Google Cloud Platform senior product manager Srikanth Belwadi. The scope of the issue was made clear by the fact that “The Good Dinosaur” required 110 million compute hours and 300 TB of active data space. Panelists from Thinkbox, Shotgun, Rodeo FX, Avere Systems, and ConductorIO discussed the challenges to producing VFX in the cloud — but also its inevitability.

Autodesk and Google Cloud Platform Bring Maya to the Cloud

Autodesk and Google have partnered to offer Maya compute services on the Google Cloud Platform via ZYNC Render, an integrated cloud-based storage and rendering solution for the VFX industry. “We’ve been in discussions with Autodesk for quite some time,” said Google product manager Todd Prives. “It’s been a collaborative effort to bring Maya to Google.” With the ZYNC service for Autodesk Maya 2016 software, users will be able to render 3D scenes on the Google Cloud Platform.

Ang Lee’s ‘Long Halftime Walk’ to 4K, 3D, 120 fps Filmmaking

Filmmaker Ang Lee gave a keynote talk at NAB 2016 with editor Tim Squyres and production system supervisor Ben Gervais about the path to creating his upcoming feature “Billy Lynn’s Long Halftime Walk,” which was shot in 3D at 4K resolution and 120 frames per second. An 11-minute clip from the film ran all afternoon, drawing long lines and buzz. No theater can currently show the movie the way it was shot, but Lee says his curiosity and passion for storytelling led him to explore these formats, which create a compelling immersive experience.

HPA Tech Retreat: Next-Gen Cloud Workflows Hosted by ETC

The Entertainment Technology Center@USC hosted a discussion on next-generation cloud workflows, featuring toolsets and specific technologies. Led by ETC’s cloud project lead Erik Weaver, the conversation began with the real-world case study for post production in the cloud implemented by Los Angeles post facility DigitalFilm Tree. That company’s CTO/managing partner Guillaume Aubuchon led the audience through the workflows put in place to handle productions taking place in “remote parts of Asia and Africa.”

HPA Tech Retreat: How the Pros Handle HDR in Post Production

In a panel organized by Colorfront’s Bruno Munger and moderated by British Cinematographer magazine editor Ron Prince, a group of executives and engineers tackled the topic of HDR workflow in post production. Netflix production engineer for original content Chris Clark pointed to shows like “Marco Polo,” noting that, “Netflix is obviously really excited about HDR.” The company has now set up a pipeline that enables any production to “flip to HDR” if it wants to. “We are all about future-proofing,” he said.

HPA Tech Retreat: The Production Workflow Incorporating HDR

A panel of cinematographers, a digital imaging technician and a camera manufacturer talked about HDR production workflow issues that begin in pre-production discussions. The panel’s moderator, VFX cinematographer Mark Weingartner, asked the panelists if there were “fundamental differences between the ways we have been shooting and the way we need to shoot” for HDR. Cinematographer Bill Bennett, ASC noted that “since the inception of cinematography, we’ve been recording HDR images with film.”