This is done by taking in face tracking weights (an array of weights based on the Facial Action Coding System (FACS)) and applying them to a SkinnedMeshRenderer by updating the blend shapes on the SkinnedMeshRenderer every frame. Because the values from the face tracking data may differ in range from the blend shape range on the skinned mesh renderer, the values need to be scaled by setting OVRFace.BlendShapeStrengthMultiplier, which multiplies the input before the blend shape values are set on the SkinnedMeshRenderer. For more information, please see
Face Tracking for Movement SDK for Unity.
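To make the per-frame flow concrete, the following is a minimal, illustrative sketch of applying an array of tracking weights to a SkinnedMeshRenderer each frame with a strength multiplier. It is not the OVRFace implementation: the BlendShapeWeightApplier class, the GetTrackingWeights placeholder, and the assumption that tracking weights in the 0–1 range are scaled to Unity's typical 0–100 blend shape range are all hypothetical stand-ins for the behavior described above.

```csharp
using UnityEngine;

// Simplified sketch of the per-frame update described above.
// Not the OVRFace implementation; the weight source and the scaling
// assumption (0..1 tracking weights -> 0..100 blend shape weights)
// are placeholders for illustration only.
public class BlendShapeWeightApplier : MonoBehaviour
{
    [SerializeField] private SkinnedMeshRenderer _renderer;

    // Analogous in spirit to OVRFace.BlendShapeStrengthMultiplier:
    // scales the incoming tracking weights before they are written
    // to the blend shapes.
    [SerializeField] private float _strengthMultiplier = 100.0f;

    private void LateUpdate()
    {
        float[] weights = GetTrackingWeights(); // hypothetical weight source
        if (weights == null || _renderer == null)
        {
            return;
        }

        int count = Mathf.Min(weights.Length, _renderer.sharedMesh.blendShapeCount);
        for (int i = 0; i < count; ++i)
        {
            // Unity blend shape weights are typically 0..100, while FACS-style
            // tracking weights are often 0..1, hence the multiplier.
            _renderer.SetBlendShapeWeight(i, weights[i] * _strengthMultiplier);
        }
    }

    // Placeholder: in a real setup these values would come from face tracking,
    // e.g. an OVRFaceExpressions component.
    private float[] GetTrackingWeights()
    {
        return null;
    }
}
```

In practice OVRFace performs this work for you; the sketch only makes explicit how the multiplier sits between the tracking input and the blend shape values.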
Intended to be used as a base type that is inherited from in order to provide the mapping logic from blend shape indices to face expressions. The mapping of OVRFaceExpressions.FaceExpression to blend shapes is accomplished by overriding OVRFace.GetFaceExpression(int). Needs to be linked to an OVRFaceExpressions component to fetch tracking data from.
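As a sketch, a derived component might map blend shape indices to face expressions along these lines. The class name MyCustomFace, the mesh's blend shape ordering, the exact access modifiers on the override, and the specific FaceExpression enum member names are assumptions and should be checked against the OVRFace and OVRFaceExpressions definitions in your SDK version.

```csharp
// Sketch of a custom mapping, assuming OVRFace.GetFaceExpression(int)
// can be overridden as described above. Access modifiers, enum member
// names, and the blend shape ordering are assumptions; verify them
// against your SDK version and your mesh.
public class MyCustomFace : OVRFace
{
    protected internal override OVRFaceExpressions.FaceExpression GetFaceExpression(int blendShapeIndex)
    {
        // Map each blend shape index on the SkinnedMeshRenderer to the
        // face expression whose tracked weight should drive it.
        switch (blendShapeIndex)
        {
            case 0:
                return OVRFaceExpressions.FaceExpression.JawDrop;     // assumed mesh order
            case 1:
                return OVRFaceExpressions.FaceExpression.EyesClosedL; // assumed mesh order
            case 2:
                return OVRFaceExpressions.FaceExpression.EyesClosedR; // assumed mesh order
            default:
                // Blend shapes without a mapping are left undriven.
                return OVRFaceExpressions.FaceExpression.Invalid;
        }
    }
}
```

Such a component would be attached alongside the SkinnedMeshRenderer it drives and linked to an OVRFaceExpressions component so it has tracking data to read.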