BODY, FINGERS & FACE
Studio features a fully integrated face tracking solution capable of tracking facial expressions in real-time. Face data is captured and integrated seamlessly with body and finger data. The introduction of face tracking lets users record full performances effortlessly, without having to synchronize multiple systems.
HOW IT WORKS
The face tracking pipeline allows users to quickly animate the face of their characters by simply connecting an iPhone or iPad to Studio and pointing the camera at the actor’s face. The data is supersampled to match the framerate of the inertial system before being filtered and processed. Users are free to procure or create their own head mounts.
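As a rough illustration of that resampling step, the sketch below linearly interpolates blendshape frames from the camera's rate up to the body system's rate. The frame rates, function name, and frame format (one dictionary of blendshape coefficients per frame) are assumptions for the example, not Studio's actual implementation.

```python
def supersample(face_frames, src_fps=60.0, dst_fps=100.0):
    """Linearly interpolate blendshape frames from the camera's rate
    up to the inertial system's rate (rates here are illustrative)."""
    if not face_frames:
        return []
    duration = (len(face_frames) - 1) / src_fps
    n_out = int(duration * dst_fps) + 1
    out = []
    for i in range(n_out):
        t = i / dst_fps * src_fps            # fractional source-frame index
        lo = int(t)
        hi = min(lo + 1, len(face_frames) - 1)
        a = t - lo                           # blend factor between lo and hi
        out.append({
            name: (1 - a) * face_frames[lo][name] + a * face_frames[hi][name]
            for name in face_frames[lo]
        })
    return out
```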
The iPhone uses its camera to create a point cloud of your face. The point cloud is converted to blendshape data and bone animations for eye and jaw movements.
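A minimal sketch of how blendshape coefficients could be turned into eye and jaw bone rotations. The blendshape names follow ARKit's naming convention, while the angle ranges and function are illustrative assumptions rather than the app's actual conversion.

```python
# Illustrative joint limits in degrees; a real rig defines its own ranges.
MAX_JAW_PITCH = 25.0
MAX_EYE_YAW = 30.0
MAX_EYE_PITCH = 25.0

def bones_from_blendshapes(bs):
    """Derive approximate jaw and left-eye rotations from ARKit-style
    blendshape coefficients (each value in the 0..1 range)."""
    jaw_pitch = bs.get("jawOpen", 0.0) * MAX_JAW_PITCH
    eye_yaw = (bs.get("eyeLookOutLeft", 0.0) - bs.get("eyeLookInLeft", 0.0)) * MAX_EYE_YAW
    eye_pitch = (bs.get("eyeLookUpLeft", 0.0) - bs.get("eyeLookDownLeft", 0.0)) * MAX_EYE_PITCH
    return {
        "jaw":      {"pitch": jaw_pitch, "yaw": 0.0},
        "eye_left": {"pitch": eye_pitch, "yaw": eye_yaw},
    }
```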
Studio receives and processes the data before applying it to the mannequin. The mask system adjusts the influence of each blendshape channel, letting you fine-tune how strongly each expression comes through.
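Conceptually, the mask can be pictured as a per-channel weight applied to each blendshape value. The sketch below assumes values and weights in the 0..1 range; the channel names and function are hypothetical.

```python
def apply_mask(blendshapes, mask):
    """Scale each blendshape channel by a per-channel weight,
    clamping the result back into the 0..1 range."""
    return {
        name: min(max(value * mask.get(name, 1.0), 0.0), 1.0)
        for name, value in blendshapes.items()
    }

# Example: soften the brows and mute tongue motion entirely.
masked = apply_mask({"browInnerUp": 0.8, "tongueOut": 0.3, "jawOpen": 0.5},
                    {"browInnerUp": 0.6, "tongueOut": 0.0})
```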
The mannequin face animation is mapped and retargeted to the character asset. The result can be exported as a game asset or broadcast to game engines in real-time.
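The retargeting and broadcasting steps could be pictured roughly like this: rename mannequin channels to the character's own shape names, then stream each frame over the network. The channel map, JSON payload, and UDP port are illustrative assumptions, not Studio's actual protocol.

```python
import json
import socket

# Hypothetical mapping from mannequin channels to this character's shape names.
CHANNEL_MAP = {
    "jawOpen": "JawDrop",
    "mouthSmileLeft": "Smile_L",
    "mouthSmileRight": "Smile_R",
}

def retarget(blendshapes, channel_map=CHANNEL_MAP):
    """Rename mannequin blendshape channels to the character's own shapes."""
    return {channel_map[k]: v for k, v in blendshapes.items() if k in channel_map}

def broadcast(frame, host="127.0.0.1", port=14043):
    """Send one retargeted frame as a JSON datagram (port is illustrative)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(json.dumps(frame).encode("utf-8"), (host, port))
```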
Face data consists of blendshapes and bones, which integrate seamlessly with the body and finger data and can be retargeted to character assets in real-time.
By combining face, body and finger data at the mannequin level, you can record full performances. These can be retargeted directly to your character assets using the character engine, or broadcast to game engines and other third-party software applications in real-time.
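One way to picture a full performance is as a stream of combined frames, each carrying the body, finger, and face data for a single timestamp; such frames can then be recorded, retargeted, or streamed together. The structure and field names below are purely illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class PerformanceFrame:
    """One combined frame of a full performance (field names are illustrative)."""
    timestamp: float                                        # seconds since start
    body_pose: dict = field(default_factory=dict)           # joint -> rotation
    finger_pose: dict = field(default_factory=dict)         # finger joint -> rotation
    face_blendshapes: dict = field(default_factory=dict)    # channel -> 0..1
    face_bones: dict = field(default_factory=dict)          # eye/jaw rotations

# A frame like this can be appended to a recording or streamed as-is.
frame = PerformanceFrame(timestamp=0.01,
                         body_pose={"hips": (0.0, 0.0, 0.0)},
                         face_blendshapes={"jawOpen": 0.4})
```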