
Unreal’s New iPhone App Does Live Motion Capture with Face ID Sensors

by martyb from SoylentNews on (#55K0V)

upstart writes in with an IRC submission:

Unreal's new iPhone app does live motion capture with Face ID sensors:

Unreal Engine developer Epic Games has released Live Link Face, an iPhone app that uses the phone's front-facing 3D sensors to do live motion capture for facial animation in 3D projects such as video games, animated features, and films.

The app uses tools from Apple's ARKit framework and the iPhone's TrueDepth sensor array to stream live motion capture from an actor looking at the phone to 3D characters in Unreal Engine running on a nearby workstation. It captures facial expressions as well as head and neck rotation.
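As a rough illustration of the ARKit side of this pipeline, the sketch below shows how an app can read the blendshape coefficients and head pose that ARKit derives from the TrueDepth camera. It is a minimal sketch, not Epic's implementation: FaceCaptureSession and streamToWorkstation are hypothetical names, and the network protocol Live Link Face uses to reach Unreal Engine is not described in the article.

```swift
import ARKit
import simd

// Minimal sketch of reading ARKit face-capture data on an iPhone with a
// TrueDepth (Face ID) sensor. Only the ARKit calls are real API; the class
// and streaming method are placeholders.
final class FaceCaptureSession: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera hardware.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // ARKit calls this whenever tracked anchors update (roughly once per frame).
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Around fifty coefficients in 0.0...1.0, e.g. .jawOpen,
            // .eyeBlinkLeft, .browInnerUp, describing the current expression.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            // Head position and rotation come from the anchor's transform.
            streamToWorkstation(jawOpen: jawOpen, headTransform: face.transform)
        }
    }

    private func streamToWorkstation(jawOpen: Float, headTransform: simd_float4x4) {
        // Placeholder: a real app would timestamp these values and send them
        // over the network to characters running in Unreal Engine.
    }
}
```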

Live Link Face can stream to multiple machines at once, and "robust timecode support and precise frame accuracy enable seamless synchronization with other stage components like cameras and body motion capture," according to Epic's blog post announcing the app. Users get a CSV of raw blendshape data and an MOV from the phone's front-facing video camera, with timecodes.
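To make the take output concrete, here is a hedged sketch of writing per-frame blendshape values to a CSV with timecodes. The article does not document the app's actual column layout, so the format below (one row per frame, a timecode column followed by one column per blendshape name) is an assumption for illustration only.

```swift
import Foundation

// Hypothetical take recorder: one CSV row per captured frame.
struct BlendShapeFrame {
    let timecode: String      // e.g. "01:02:03:04" (HH:MM:SS:FF) -- assumed format
    let coefficients: [Float] // one value per blendshape, in 0.0...1.0
}

func writeTakeCSV(frames: [BlendShapeFrame], shapeNames: [String], to url: URL) throws {
    // Header row lists the timecode column and each blendshape name.
    var rows = ["Timecode," + shapeNames.joined(separator: ",")]
    for frame in frames {
        let values = frame.coefficients.map { String($0) }.joined(separator: ",")
        rows.append("\(frame.timecode),\(values)")
    }
    try rows.joined(separator: "\n").write(to: url, atomically: true, encoding: .utf8)
}
```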

[...] For those not familiar, Unreal Engine began life as a graphics engine used by triple-A video game studios for titles like Gears of War and Mass Effect, and it has evolved over time to be used by indies and in other situations like filmmaking, architecture, and design. It competes with another popular engine called Unity, as well as in-house tools developed by various studios.

Original Submission

