Movement advanced topics
Updated: Oct 10, 2024
This plugin provides animation nodes that allow you to use Meta’s movement features from within Unreal’s Animation Graph.
After completing this section, the developer should:
- Understand the different tracking nodes provided by the OculusXRMovement plugin.
- Understand the purpose of the configuration settings for each of the nodes.
Note: Currently, the OculusXRMovement plugin is distributed as source. See OculusXRMovement Plugin for instructions.
OculusXR Body Tracking node
The Body Tracking node will take body tracking data from the Movement API and apply it to your character’s skeleton during runtime.
To implement body tracking, add the OculusXR Body Tracking node to your Animation Blueprint and hook up inputs and outputs. For a detailed guide on how to set it up, see Animation Node Implementation.
Any bones you are retargeting will have their poses overwritten by the body tracking data. Place this node at the beginning of your node graph and perform additional operations (such as animation blending) after it.
Any pose data that isn’t modified by this node will be passed through.

Property | Description |
---|---|
BoneRemapping | The map between the OculusXRBones and the Target Skeleton Bones. If you use a different skeleton than the Unreal Engine 5 Mannequin, you will need to remap the bones accordingly. |
LocalCorrectionsMap | A map of local bone adjustments. This is used to correct the orientation and position of incorrectly aligned bones. |
RetargetingMode | The mode used to retarget the animation data. |
ForwardMesh | The forward axis of the Target Skeleton (your avatar). This is used to correctly calculate the orientation of the bones during retargeting. |
OculusXR Face Tracking node
The Face Tracking node will take face tracking data from the Movement API and apply it to your character’s facial morph targets during runtime.
To use it, add it to your Animation Blueprint and hook up inputs and outputs. See Animation Node Implementation.
Any morph targets driven by this node will have their curves overwritten by the face tracking data. Add this node at the beginning of your node graph and perform additional operations, such as animation blending, after it. Any pose data not modified by this node will pass through unchanged.
Property | Description |
---|---|
Expression Names | The map that defines which morph targets on your character should be affected when face tracking detects changes. For each expression, enter the name of your character’s corresponding morph target. The Morph Target Visual Reference section shows visually what each morph target refers to. |
Expression Modifiers | Expression modifiers allow you to amplify or clamp specific morph target values for facial expressions. Use this when face tracking is causing too large or too small a change in your character’s face. Min Value: the lowest value of the morph target. Max Value: the highest value of the morph target. Multiplier: increases/decreases the effect of the morph target. |
Face Expression Modifiers
A character’s face may not always align correctly with its physical counterpart.
Example: When Aura is smiling with her mouth closed, her lips can appear behind her teeth. By adding an expression modifier that reduces the magnitude of the smile, Aura’s teeth are prevented from clipping through her lips when she smiles.
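To make the modifier behavior concrete, here is a minimal C++ sketch of a multiply-then-clamp modifier. This is not the plugin’s actual implementation; the function name and the order of operations are assumptions for illustration.

```cpp
#include "Math/UnrealMathUtility.h"

// Illustrative only: the usual shape of a min/max/multiplier modifier
// applied to a raw tracked morph target value (normally in [0, 1]).
// Not the plugin's actual code.
float ApplyExpressionModifier(float RawCurveValue, float Multiplier, float MinValue, float MaxValue)
{
    // Scale the tracked value first, then clamp it into the configured range.
    return FMath::Clamp(RawCurveValue * Multiplier, MinValue, MaxValue);
}
```

For the Aura example above, a multiplier below 1.0 on the smile morph target reduces how far the lips travel, which keeps the teeth from clipping.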
OculusXR Eye Tracking node
The Eye Tracking node will take eye tracking data from the Movement API and apply it to your character’s eye bones during runtime.
To use it, add it to your Animation Blueprint and hook up inputs and outputs. See Animation Node Implementation.
Any eye bones driven by this node will have their poses overwritten by the eye tracking data. Add this node at the beginning of your node graph and perform additional operations, such as animation blending, after it.
Property | Description |
---|---|
Left Eye Bone | The name of your character’s left eye bone. |
Right Eye Bone | The name of your character’s right eye bone. |
After completing this section, the developer should understand the different retargeting modes, should they choose not to use the default Rotation & Positions mode.
Retargeting is implemented with different modes that define how the incoming body tracking data is combined with the existing skeleton of your avatar. Each mode comes with its own features, drawbacks, and limitations. The modes are named according to how they handle the incoming body tracking data.
Retargeting mode: Rotation & Positions
This is the default mode and provides the most accurate reflection of actual movement.
- The character is first scaled to match the user’s size, with height based on the wrist positions in a T-pose.
- The arms are scaled in length to match the user’s wrist positions. Finally, the finger joints are aligned with the tracking skeleton and the hands are dynamically scaled to match the tracking input.
- The tracking skeleton is then used to derive both position and rotation transforms to apply to this scaled character to provide accurate body movement.
This means:
- Your avatar will scale, stretch, or shrink to match your actual size.
- Your avatar’s proportions will not be preserved.
- Deformations in your character’s mesh are minimized due to the scaling. In this case, the mesh should still allow for some variation in joint positions because tracked positions may vary.
- Actions like clapping your hands should work naturally.
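As a rough illustration of the wrist-based scaling described above, the sketch below derives a uniform scale factor from tracked versus avatar wrist height in a T-pose. The function and the exact formula are assumptions, not the plugin’s code.

```cpp
#include "Math/UnrealMathUtility.h"

// Illustrative sketch: compute a uniform scale factor from the tracked
// wrist height versus the avatar's authored wrist height in a T-pose.
// This is an assumption about the scaling described above.
float ComputeAvatarScale(float TrackedWristHeight, float AvatarWristHeight)
{
    // Fall back to no scaling for degenerate rigs.
    if (FMath::IsNearlyZero(AvatarWristHeight))
    {
        return 1.0f;
    }
    return TrackedWristHeight / AvatarWristHeight;
}
```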
Retargeting mode: Rotation & Positions - Hands Rotation Only
This mode is identical to Rotation & Positions except for the hands. For the hand bones (e.g. fingers), only rotations are applied. This is helpful because, when using controllers, we don’t have accurate hand positions; without this mode, you might see a change in the size of the hands when switching between hand tracking and controllers.
This means:
- Your avatar will scale, stretch, or shrink to match your actual size.
- Your avatar’s proportions will not be preserved except for hands.
- Hands will be consistent between controllers and hand tracking.
Retargeting mode: Rotation Only
On each bone of the skeleton, only rotation data from the body tracking is applied to the bone.
The size of the character is scaled relative to the wrists along the up axis when in a T-pose.
This means:
- Your avatar will scale, stretching or shrinking to match your actual size.
- Your avatar’s proportions will be preserved.
Retargeting mode: Rotation Only - No Scaling
On each bone of the skeleton, only rotation data from the body tracking is applied to the bone. The size of the imported character will not be changed.
This means:
- Your avatar will NOT match your actual size.
- Your avatar’s proportions will be preserved.
- The character’s rotations will follow that of the tracked user, but the positions will be inaccurate.
- Since proportions on the character are likely different from the tracked person, movements like clapping your hands may cause body-to-body intersection.
Bone mapping for a custom rig
Body tracking retargeting works on humanoid skeletons, which consist of five primary bone chains.
- Spine: Starts from the hips and forms the spine up to the head.
- Arms (x2): Starts from its attachment at the chest and goes all the way to the hands and fingers.
- Legs (x2): Starts from its attachment to the hips and goes all the way to the feet and toes.
The humanoid character may have additional bone chains for each finger or toe depending on the fidelity of the character.
To set up the correct bone mapping on your custom avatar, you need to identify these bone chains and map them to the corresponding bones in the Oculus Body Tracking Skeleton.
Navigate to the OculusXR Body Tracking animation node and fill in the BoneRemapping map with the names of the bones in your custom avatar, according to the following guide:
- Identify the first bone in your character’s spine and enter the name into the map: BodyHips: MyHipBoneName. It is the parent of the two legs and the base of the spine. In the case of the Mannequin, this is called pelvis.
- Identify the last bone in your character’s spine and enter the name into the map: BodyHead: MyHeadBoneName. This is the bone at the top of the spine bone chain.
- Identify the BodyChest bone and enter the name into the map: BodyChest: MyChestBoneName. This is the bone that connects the two arms to the spine.
- Now fill in the remaining names of the bones in the chain:
- BodyHips > BodySpineLower > BodySpineMiddle > BodySpineUpper > BodyChest > BodyNeck > BodyHead.
- If your character has more bones in the spine chain than this, you need to determine which of the bones should be added here.
- If your character has fewer bones in the spine chain than this, leave one or more of the bones empty.
- Identify the BodyRightShoulder bone of your character and enter it into the map: BodyRightShoulder: MyRightShoulderBoneName. This is the bone that connects the right arm to the chest. When this bone is rotated, the right shoulder should move. In the case of the Mannequin, this is called clavicle_r.
- Identify the BodyRightHandWrist bone of your character and enter it into the map: BodyRightHandWrist: MyRightHandWristBoneName. This is the bone at the end of the right arm. When this bone is rotated, the right hand should move. In the case of the Mannequin, this is called hand_r.
- Now fill in the remaining names of the bones in the chain:
- BodyRightShoulder > BodyRightScapula > BodyRightArmUpper > BodyRightArmLower > BodyRightHandWrist
- For the Mannequin skeleton this is: clavicle_r > None > upperarm_r > lowerarm_r > hand_r. The Mannequin does not have a scapula, so we leave it empty.
- If your character has more bones in the chain than this, you need to determine which of the bones should be added here.
- If your character has fewer bones in the chain than this, leave one or more of the bones empty.
- Repeat this for the left arm.
- Identify the BodyRightUpperLeg bone of your character and enter it into the map: BodyRightUpperLeg: MyRightUpperLegBoneName. This is the bone that connects the right leg to the hips. When this bone is rotated, the right leg should move. In the case of the Mannequin, this is called thigh_r.
- Identify the BodyRightFootBall bone of your character and enter it into the map: BodyRightFootBall: MyRightFootBallBoneName. This is the bone at the end of the right leg. When this bone is rotated, the right foot should move. In the case of the Mannequin, this is called ball_r.
- Now fill in the remaining names of the bones in the chain:
- BodyRightUpperLeg > BodyRightLowerLeg > BodyRightFootAnkle > BodyRightFootBall.
- For the Mannequin skeleton this is: thigh_r > calf_r > foot_r > ball_r.
- If your character has more bones in the chain than this, you need to determine which of the bones should be added here.
- If your character has fewer bones in the chain than this, leave one or more of the bones empty.
- Repeat this for the left leg.
- Identify the start of the bone chain for the index finger and enter it into the map: BodyRightHandIndexMetacarpal: MyRightHandIndexMetacarpalBoneName. This is the bone that connects the index finger to the wrist. In the case of the Mannequin, this is called index_metacarpal_r.
- Identify the end of the bone chain for the index finger and enter it into the map: BodyRightHandIndexDistal: MyRightHandIndexDistalBoneName. This is the bone at the end of the index finger.
- Now fill in the remaining names of the bones in the chain:
- BodyRightHandIndexMetacarpal > BodyRightHandIndexProximal > BodyRightHandIndexIntermediate > BodyRightHandIndexDistal.
- For the Mannequin skeleton this is: index_metacarpal_r > index_01_r > index_02_r > index_03_r.
- Repeat this for each finger on both hands. Note that the thumb may have a different bone chain than the other fingers.
The Oculus Body Tracking Skeleton has bones for the tips of the fingers. These can be useful if you need to detect when fingertips are touching something. Since these are leaf bones in the skeleton, they don’t need to be mapped to the avatar if you don’t need them. The Mannequin does not have bones for the fingertips.
The Movement SDK does not provide support for detailed tracking of feet and toes. Beyond the ankle and foot-ball bones, the Oculus Body Tracking Skeleton does not include bones for individual toes, so it is not possible to map them.
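Bringing the steps above together, the following sketch shows what a Mannequin mapping could look like if assembled in C++. Only the Mannequin bone names listed above are used; the EOculusXRBoneID enum, its identifiers, and the header name are assumptions about the plugin’s source, so verify them before use.

```cpp
#include "Containers/Map.h"
#include "UObject/NameTypes.h"
#include "OculusXRMovementTypes.h" // assumption: plugin header defining EOculusXRBoneID

// Hedged sketch of a BoneRemapping map for the UE5 Mannequin, built from
// the bone names given in the steps above. Not authoritative plugin code.
TMap<EOculusXRBoneID, FName> MakeMannequinBoneRemapping()
{
    TMap<EOculusXRBoneID, FName> BoneRemapping;
    // Spine chain: only the hips name is given above; fill in the rest for your rig.
    BoneRemapping.Add(EOculusXRBoneID::BodyHips, TEXT("pelvis"));
    // Right arm chain: clavicle_r > None > upperarm_r > lowerarm_r > hand_r.
    BoneRemapping.Add(EOculusXRBoneID::BodyRightShoulder, TEXT("clavicle_r"));
    // BodyRightScapula is left unmapped: the Mannequin has no scapula bone.
    BoneRemapping.Add(EOculusXRBoneID::BodyRightArmUpper, TEXT("upperarm_r"));
    BoneRemapping.Add(EOculusXRBoneID::BodyRightArmLower, TEXT("lowerarm_r"));
    BoneRemapping.Add(EOculusXRBoneID::BodyRightHandWrist, TEXT("hand_r"));
    // Right leg chain: thigh_r > calf_r > foot_r > ball_r.
    BoneRemapping.Add(EOculusXRBoneID::BodyRightUpperLeg, TEXT("thigh_r"));
    BoneRemapping.Add(EOculusXRBoneID::BodyRightLowerLeg, TEXT("calf_r"));
    BoneRemapping.Add(EOculusXRBoneID::BodyRightFootAnkle, TEXT("foot_r"));
    BoneRemapping.Add(EOculusXRBoneID::BodyRightFootBall, TEXT("ball_r"));
    // Right index finger chain: index_metacarpal_r > index_01_r > index_02_r > index_03_r.
    BoneRemapping.Add(EOculusXRBoneID::BodyRightHandIndexMetacarpal, TEXT("index_metacarpal_r"));
    BoneRemapping.Add(EOculusXRBoneID::BodyRightHandIndexProximal, TEXT("index_01_r"));
    BoneRemapping.Add(EOculusXRBoneID::BodyRightHandIndexIntermediate, TEXT("index_02_r"));
    BoneRemapping.Add(EOculusXRBoneID::BodyRightHandIndexDistal, TEXT("index_03_r"));
    // Repeat for the left side, the remaining fingers, and the spine bones.
    return BoneRemapping;
}
```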
Configuring an avatar for tracking
After completing this section, the developer should:
- Understand the relationship of the character to Live Link.
- Be able to apply body tracking to a character model using Live Link or Components.
In addition to the Animation Node method described above, you can also set up your avatar using Live Link or a component-based method.
The Movement SDK is exposed through the Live Link interface via the Meta MovementSDK Live Link source.
This source exposes three subjects: Body, Face, and Eye. When the MovementSDK Live Link source is active and a connected device supports a subject, it pushes the most recent data from the corresponding trackers to the Live Link client on every tick.
This functionality is available in Unreal Editor and in-app on all platforms.
To use Meta MovementSDK Live Link source, create a Live Link preset in Unreal Editor.
- Open the Live Link Streaming Manager tab (Window > Virtual Production > Live Link). The Live Link tab will appear in your environment.
- Click Source and choose Meta MovementSDK Live Link.
- Click the Presets button and choose Save As Preset to save it as an asset.
If your device is not connected to the PC with Link (and therefore not connected to Unreal Editor), the Meta XR source will appear as Not Supported. This is expected and will not cause an issue when saving and using this preset later, once the device is connected.
Setting up a character for body tracking with Live Link
- Ensure you have an active Live Link setup by following the instructions in our public documentation: Enabling Live Link. If that step has been done correctly, you will see the Body subject in the list of available subjects.

- Create an instance of OculusXRLiveLinkRetargetBodyAsset for your project and give it a name of your choice. The body retarget asset defines how tracking data for the Oculus skeleton is transformed to the Mannequin skeleton, including mapping bone names and correcting rotation and position.
- Open the Body Retarget asset and recompile it. Modify corrections as needed.
- Create an Animation Blueprint based on the SK_Mannequin skeleton and give it a name of your choice.
- Edit the Animation Blueprint to add a Live Link Pose node and connect it to the Output Pose.
- Select Body as the subject name to connect to the Live Link source.
- On the Details tab of the Live Link Pose node, in the Retarget section, select OculusXRLiveLinkRetargetBodyAsset as the Retarget Asset (the instance created in step 2).

- Compile and save the Blueprint.
- Create a Pawn Blueprint and give it a name of your choice.
- Inside the DefaultSceneRoot in the Pawn Viewport, add a skeletal mesh that is compatible with the SK_Mannequin skeleton (SKM_Manny_Simple is a good asset for experimenting).
- Under the animation parameter of the added skeletal mesh, select the Animation Blueprint created in step 4.
- Compile and save the Pawn Blueprint.

- Within the Pawn’s event graph, we need to set up the Live Link preset.
- Create a new variable to hold the LiveLinkPreset with the type Object Reference to Live Link Preset. Compile the Blueprint.

- Select MetaXRLiveLinkPreset as the default value for that variable. This is the preset you used when setting up Live Link for the project in step 1.

- At the BeginPlay event, call the Apply to Client function with the LiveLinkPreset variable as a parameter.
If you cannot find the Apply to Client function, click the Context Sensitive toggle.

- Compile and save the Blueprint.
- Add the Pawn Blueprint you created earlier to the level.
- IOBT and FBS should be enabled in Project Settings (scroll to the Mobile group of the MetaXR section). Setting Body Tracking Fidelity to High enables IOBT, while Low means three-point tracking. Setting Body Tracking Joint Set to Full Body enables FBS.

When you have done that, your avatar should be driven by body tracking.
Live Link starts body tracking automatically, but you can also use the Start Body Tracking by Joint Set node. The joint set cannot be changed at runtime without restarting (stopping and starting again) body tracking.
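If you prefer a C++ pawn over the Blueprint steps above, here is a hedged sketch of the same BeginPlay setup using the Live Link preset’s Apply to Client function. The pawn class and its LiveLinkPreset property are illustrative; point the property at the preset asset you saved earlier.

```cpp
#include "GameFramework/Pawn.h"
#include "LiveLinkPreset.h"
#include "MyTrackedPawn.generated.h" // illustrative: a normal UCLASS pawn header

UCLASS()
class AMyTrackedPawn : public APawn
{
    GENERATED_BODY()

public:
    // Set this to your saved MetaXRLiveLinkPreset asset in the editor.
    UPROPERTY(EditAnywhere, Category = "LiveLink")
    TObjectPtr<ULiveLinkPreset> LiveLinkPreset;

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        if (LiveLinkPreset)
        {
            // Equivalent to the Blueprint "Apply to Client" node.
            LiveLinkPreset->ApplyToClient();
        }
    }
};
```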
Setting up a character for body tracking with a component
Body tracking in Unreal can be driven by the BodyTrackingComponent. This is an alternative to the Live Link method and should not be used together with it.
Each frame, the BodyTrackingComponent requests the most up-to-date tracking data and applies it to a set of known bones. Configuring the BodyTrackingComponent requires specifying the names of bones on a skeletal mesh that map to the set of tracked bones available in OpenXR.
The BodyTrackingComponent is derived from PoseableMeshComponent. The component will not function without a skeletal mesh; the skeletal mesh provided to the component is manipulated as a result of tracking.
The bone mappings are configured via the bone names field under the OculusXR->Movement properties header.
The bones known to OpenXR are listed on the left-hand side, with the corresponding bone names on the right. The list is auto-populated when you create a new BodyTrackingComponent.
Replace the bone names in this list with the associated bones of your own avatar. If some bones are missing, the component will still function.
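The same mapping can also be adjusted in C++. The sketch below assumes the plugin exposes a UOculusXRBodyTrackingComponent class with a BoneNames map matching the Details panel described above; treat the class, header, and property names as assumptions and verify them against the plugin source.

```cpp
#include "OculusXRBodyTrackingComponent.h" // assumption: plugin header name
#include "OculusXRMovementTypes.h"         // assumption: plugin header defining EOculusXRBoneID

// Hedged sketch: override part of the auto-populated bone map at startup,
// e.g. from your actor's BeginPlay. BoneNames is an assumed property name.
void RemapBonesForMyAvatar(UOculusXRBodyTrackingComponent* BodyTracking)
{
    // Point the OpenXR hips bone at this avatar's root bone; bones you leave
    // out keep their auto-populated names, and missing bones are tolerated.
    BodyTracking->BoneNames.Add(EOculusXRBoneID::BodyHips, TEXT("pelvis"));
}
```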
Eye tracking in Unreal is driven by the EyeTrackingComponent. Each frame, the eye tracking component requests up-to-date eye tracking data and applies it to simulated eye joints on a target mesh component.
Configuring eye tracking involves specifying the PoseableMeshComponent target on the actor and defining the joints used for displaying the results.
The EyeTrackingComponent is derived from ActorComponent and has the following properties under the OculusXR > Movement property heading.
You are required to specify the Target Mesh Component Name. This can be any PoseableMeshComponent but most likely will be a BodyTrackingComponent if doing eye and body tracking at the same time. The PoseableMeshComponent must also be on the same actor as the EyeTrackingComponent.
You are required to define at least one valid eye mapping. The eye mappings consist of an eye target “Left” or “Right” and a joint name. The joint name must match one of the joint names existing on the SkeletalMesh asset contained within the Target Mesh Component.
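A minimal C++ sketch of the same configuration is shown below, assuming the component exposes a TargetMeshComponentName property matching the Details panel; the class, header, and property names are assumptions to verify against the plugin source, and the eye mappings themselves are described only in comments because their exact property shape is not documented here.

```cpp
#include "OculusXREyeTrackingComponent.h" // assumption: plugin header name

// Illustrative only: TargetMeshComponentName is an assumed property name
// modeled on the Details panel described above.
void ConfigureEyeTracking(UOculusXREyeTrackingComponent* EyeTracking)
{
    // Name of the PoseableMeshComponent on the same actor that owns the eye
    // bones (a BodyTrackingComponent if you run eye and body tracking together).
    EyeTracking->TargetMeshComponentName = TEXT("BodyTracking");

    // In addition, define at least one eye mapping in the Details panel:
    // an eye target ("Left" or "Right") paired with a joint name that exists
    // on the SkeletalMesh asset inside the target mesh component.
}
```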
Face tracking for Unreal is driven by a FaceTrackingComponent. In each frame, the face tracking component will request the newest face tracking data and apply that data to blend shapes on a specified SkinnedMeshComponent.
To configure face tracking, you must specify the SkinnedMeshComponent on the actor and define the morph target (or blend shape) names to display the tracking results.
Properties related to face tracking can be found on the FaceTrackingComponent under OculusXR->Movement.
Specify the names of morph targets (blend shapes) in the Expression Names map. The names on the left line up with known OpenXR expressions. The names on the right are the names of your own mesh’s morph targets (blend shapes).
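As with the other components, the Expression Names map can be filled from C++. In the sketch below, the component class, enum, header, and property names are assumptions based on the plugin’s source distribution, and the morph target names on the right are hypothetical placeholders that must match your own mesh.

```cpp
#include "OculusXRFaceTrackingComponent.h" // assumption: plugin header name

// Hedged sketch of filling the Expression Names map from C++.
// Left side: OpenXR expressions (assumed enum identifiers).
// Right side: your mesh's morph target (blend shape) names (hypothetical).
void ConfigureFaceTracking(UOculusXRFaceTrackingComponent* FaceTracking)
{
    FaceTracking->ExpressionNames.Add(EOculusXRFaceExpression::JawDrop, TEXT("jawOpen"));
    FaceTracking->ExpressionNames.Add(EOculusXRFaceExpression::LipCornerPullerL, TEXT("mouthSmileLeft"));
}
```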