Face Tracking 

See Facial Capture Guidelines Here.

How Do I Access The Face Tracking Feature?

The new Face Tracking toggle is located under the ‘Animation Output’ settings. Simply turn it on and generate your animations as usual.

How Do I Download The Face Tracking Animation?

The Face Tracking animation is included as blendshape animation in the default animation export, or comes already retargeted onto your custom character. If you download the default export, you can then retarget it to a character of your choice.
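If you want to check what the export contains, the downloaded file can be inspected in any DCC tool. Below is a minimal sketch using Blender's Python API that imports the file and lists the blendshape (shape key) channels carrying the facial animation; the file name "animation.fbx" is just a placeholder for your own download.

import bpy

# Import the downloaded Animate 3D export (placeholder path).
bpy.ops.import_scene.fbx(filepath="animation.fbx")

# Newly imported objects are selected; look for meshes with animated shape keys.
for obj in bpy.context.selected_objects:
    if obj.type == 'MESH' and obj.data.shape_keys:
        anim = obj.data.shape_keys.animation_data
        if anim and anim.action:
            for fcurve in anim.action.fcurves:
                # data_path looks like: key_blocks["mouthSmileLeft"].value
                print(obj.name, fcurve.data_path)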

Can I use the Custom Character feature with Face Tracking?

Our default characters work best with face tracking; however, you can use your own character with a 39-blendshape subset of the 52 ARKit blendshape standard, or you can use the custom characters generated by the built-in avatar creators Avaturn or Ready Player Me.

Face Tracking Technical Specifications

What do I need to retarget the face animation?

If you use our default characters to create the facial animations, you can retarget the blendshape weights in the animations, which conform to the ARKit Blendshape standard, to your own characters yourself in your favorite DCC tools. If you use Custom Characters to create the facial animations and your custom character has a face rig containing the 39-blendshape subset of the 52 ARKit Blendshape standard, the downloaded .FBX or .GLB animations will already be retargeted to your custom character.
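For example, if both the exported default character and your own character are loaded in Blender, one simple way to retarget is to copy the blendshape weights across by name and re-keyframe them. The sketch below assumes the animated source mesh is named "SourceFace" and your character's mesh is named "TargetFace" (both hypothetical names), and that the target uses the same ARKit blendshape names.

import bpy

scene = bpy.context.scene
src = bpy.data.objects["SourceFace"].data.shape_keys.key_blocks
dst = bpy.data.objects["TargetFace"].data.shape_keys.key_blocks

for frame in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(frame)                    # evaluate the source animation at this frame
    for key in src:
        target = dst.get(key.name)            # match blendshapes by their ARKit names
        if target is not None:
            target.value = key.value
            target.keyframe_insert(data_path="value", frame=frame)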

Face Tracking Technical Specifications

Our face tracking output uses a subset of the 52 ARKit Blendshape standard. Our specific set-up includes 39 blendshapes in total, plus rotations on one head joint and two eyeball joints. You can use the full standard, but make sure the Blendshape Specifications below are followed so that animation retargeting and your custom characters work correctly.

Custom Character Face Tracking Requirements

Name Your Blendshapes Correctly: Custom characters with the 39 ARKit blendshapes listed below can be used for face tracking; the full standard set of 52 blendshapes can also be used if desired. When face tracking is enabled, Animate 3D applies the blendshape weights according to the blendshape names, so make sure your model’s blendshape names exactly match the ARKit standard. If they don’t, rename them before you upload to Animate 3D.
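Renaming can be scripted in your DCC tool. Below is a minimal Blender (bpy) sketch, assuming your mesh object is named "MyCharacter" and that the left-hand names in RENAME_MAP are your model's current blendshape names (both are hypothetical placeholders to replace with your own).

import bpy

RENAME_MAP = {
    "Mouth_Smile_L": "mouthSmileLeft",
    "Mouth_Smile_R": "mouthSmileRight",
    "Jaw_Open": "jawOpen",
    # ...extend with the rest of your model's blendshapes
}

obj = bpy.data.objects["MyCharacter"]
for key in obj.data.shape_keys.key_blocks:
    if key.name in RENAME_MAP:
        key.name = RENAME_MAP[key.name]   # rename to the exact ARKit name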

Joint Setup: 

Your character rig needs to have a head joint and two eyeball joints. The eyeball joints should control the rotation of your eyeball mesh, which should be looking straight ahead by default.
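As a quick sanity check before uploading, you can list the bones of your rig and confirm that the head and two eyeball joints exist. A minimal Blender (bpy) sketch follows; the bone names "Head", "LeftEye", and "RightEye" are assumptions, so substitute whatever names your rig actually uses.

import bpy

REQUIRED_BONES = {"Head", "LeftEye", "RightEye"}   # assumed names; adjust to your rig

# Find the first armature in the scene and compare its bone names to the set above.
rig = next((obj for obj in bpy.data.objects if obj.type == 'ARMATURE'), None)
if rig is None:
    print("No armature found in the scene")
else:
    bone_names = {bone.name for bone in rig.data.bones}
    missing = REQUIRED_BONES - bone_names
    print("Missing joints:", sorted(missing) if missing else "none")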

Full-Body Humanoid Characters Needed: 

We currently only support full-body custom characters, not head-only ones, because face tracking is a supplement to body tracking and we don’t support capturing the face on its own. This means your character also needs to satisfy our custom character requirements for body tracking. You can check out our Custom Character FAQ here.

Blendshape Specifications:

These 39 blendshapes are a subset of the ARKit blendshapes (a quick validation sketch follows the list):

"mouthLeft",

"mouthRight",

"mouthSmileLeft",

"mouthSmileRight",

"mouthDimpleLeft",

"mouthDimpleRight",

"mouthStretchLeft",

"mouthStretchRight",

"mouthFrownLeft",

"mouthFrownRight",

"mouthPressLeft",

"mouthPressRight",

"mouthPucker",

"mouthFunnel",

"mouthUpperUpLeft",

"mouthUpperUpRight",

"mouthLowerDownLeft",

"mouthLowerDownRight",

"mouthShrugUpper",

"mouthShrugLower",

"mouthRollUpper",

"mouthRollLower",

"cheekPuff",

"cheekSquintLeft",

"cheekSquintRight",

"jawOpen",

"jawLeft",

"jawRight",

"jawForward",

"browInnerUp",

"browOuterUpLeft",

"browOuterUpRight",

"browDownLeft",

"browDownRight",

"noseSneerLeft",

"noseSneerRight",

"mouthClose",

"eyeBlinkLeft",

"eyeBlinkRight"

FAQ

To quickly find specific topics or keywords, simply use the search bar. If you have any further questions, feel free to reach out to our team through our new Discord.