Epic Games is releasing MetaHuman Animator, which lets developers create nuanced facial animation by capturing an actor’s performance with an iPhone or a stereo head-mounted camera system and a PC. According to Epic, the system eliminates the need for manual touch-ups, capturing “every subtle expression, look, and emotion” and replicating it onto a digital character for a faster performance capture workflow with more creative control. The new feature set uses a 4D solver to combine video and depth data with a MetaHuman representation of the performer, and the animation is produced locally on GPU hardware, delivering final results in “minutes.”
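Epic hasn’t published the internals of that 4D solver, but the general idea behind this class of solver is fitting a parametric face model to video and depth observations frame by frame. Here is a minimal toy sketch of per-frame fitting, assuming a linear blendshape model and synthetic data; none of the names below come from Epic’s tooling:

```python
import numpy as np

# Toy linear blendshape face model: vertices = neutral + B @ weights.
# Everything here is synthetic; a production 4D solver also uses video
# features, temporal smoothing, and a personalized identity model.
rng = np.random.default_rng(0)
n_vertices, n_shapes, n_frames = 500, 20, 30

neutral = rng.normal(size=(n_vertices * 3,))        # neutral face geometry
B = rng.normal(size=(n_vertices * 3, n_shapes))     # blendshape basis

true_weights = rng.uniform(0, 1, size=(n_frames, n_shapes))
observed = neutral + true_weights @ B.T             # "captured" geometry per frame
observed += rng.normal(scale=0.01, size=observed.shape)  # sensor noise

# Solve each frame's expression weights by least squares:
# argmin_w || (neutral + B @ w) - observed_frame ||^2
solved = np.stack([
    np.linalg.lstsq(B, frame - neutral, rcond=None)[0]
    for frame in observed
])

print("mean abs weight error:", np.abs(solved - true_weights).mean())
```

A real solver would regularize the weights across frames rather than solving each independently; the sketch only shows the core per-frame fit.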
A new tool in Epic’s Unreal Engine suite, MetaHuman Animator is being positioned as a money-saver because it speeds up the animation process, and Epic says it’s also a creativity enhancer because it lowers the barrier to experimentation.
In just a few clicks, facial animation captured using MetaHuman Animator can be applied to any MetaHuman character or any character that uses the new MetaHuman facial description standard. This is possible because Mesh to MetaHuman can now create a MetaHuman Identity from just three frames of video, along with depth data captured using your iPhone or reconstructed using data from your vertical stereo head-mounted camera.
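Epic hasn’t detailed how Mesh to MetaHuman builds that identity, but fusing a handful of depth captures into one geometry estimate follows a well-known pattern: rigidly align the frames, then average out sensor noise. Below is a toy sketch assuming known point correspondences and using the Kabsch algorithm for alignment; it is purely illustrative and does not reflect Epic’s pipeline:

```python
import numpy as np

def kabsch_align(source, target):
    """Rigidly align source points to target points (known correspondences)."""
    src_c, tgt_c = source.mean(0), target.mean(0)
    H = (source - src_c).T @ (target - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # avoid reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    return (source - src_c) @ R.T + tgt_c

rng = np.random.default_rng(1)
identity = rng.normal(size=(300, 3))                # "true" neutral face points

# Three noisy captures of the same neutral face, each in a different head pose.
frames = []
for _ in range(3):
    angle = rng.uniform(-0.2, 0.2)
    R = np.array([[np.cos(angle), -np.sin(angle), 0],
                  [np.sin(angle),  np.cos(angle), 0],
                  [0, 0, 1]])
    frames.append(identity @ R.T + rng.normal(scale=0.02, size=identity.shape))

# Align the later frames to the first, then average to estimate the identity.
aligned = [frames[0]] + [kabsch_align(f, frames[0]) for f in frames[1:]]
estimate = np.mean(aligned, axis=0)

truth_in_pose = kabsch_align(identity, frames[0])
print("estimation error:", np.abs(estimate - truth_in_pose).mean())
```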
“This personalizes the solver to the actor, enabling MetaHuman Animator to produce animation that works on any MetaHuman character,” Epic explains in a blog post, adding, “it can even use the audio to produce convincing tongue animation.”
The Unreal Engine YouTube channel has a new short called “Blue Dot” that showcases the capabilities of MetaHuman Animator to stunning effect. Created by Epic Games’ 3Lateral team, “Blue Dot” captured the performance of Serbian actor Radivoje Bukvić, who delivers a monologue based on a poem by Mika Antić.
The nuanced result “demonstrates the level of fidelity that artists and filmmakers can expect” when combining MetaHuman Animator and a stereo head-mounted camera system (which delivers superior results to the iPhone) with traditional filmmaking techniques, Epic says. Guerrilla filmmakers can use an iPhone (12 or above) to approximate the results achieved by high-end camera systems.
“That’s possible because we’ve updated the Live Link Face iOS app to capture raw video and depth data, which is then ingested directly from the device into Unreal Engine for processing,” the blog post explains.
The captured animation data supports timecode, so facial performance animation can easily be aligned with body motion capture and audio to deliver a full character performance. A second video offers step-by-step instructions on “How to Use MetaHuman Animator in Unreal Engine.”
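Timecode alignment itself is a standard technique: each track carries an hours:minutes:seconds:frames stamp, and tracks are shifted onto a common timeline by converting those stamps to absolute frame counts. Here is a small sketch, assuming a shared non-drop-frame rate across devices; it is generic arithmetic, unrelated to any Unreal Engine API:

```python
FPS = 30  # assume a shared, non-drop-frame rate across all capture devices

def timecode_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an 'hh:mm:ss:ff' SMPTE timecode to an absolute frame count."""
    h, m, s, f = (int(part) for part in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def align_offset(track_a_start: str, track_b_start: str) -> int:
    """Frame offset between two takes: positive means track A started later."""
    return timecode_to_frames(track_a_start) - timecode_to_frames(track_b_start)

# Example: the facial capture began 6 frames after the body mocap take,
# so the body track is trimmed by that offset before combining them.
face_start, body_start = "01:02:03:10", "01:02:03:04"
print(align_offset(face_start, body_start))  # -> 6
```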
Epic previewed MetaHuman Animator in March at the Game Developers Conference, and it “is now available for developers to try out for themselves,” The Verge writes.