Movement sample for Unreal
Updated: Sep 4, 2024
After completing this section, the developer should:
- Understand the context of the samples provided.
- Be able to navigate the samples and find code that can be applied to their own project.
The Movement sample for Unreal consists of four scenes, each of which demonstrates a slightly different use case. The first two are implemented with the older Actor Component method, while the last two use the preferred Animation Node method. The scenes are located in the “Content/Maps” folder.
Face (“Aura”): (“MAP_Aura”) This is a face tracking scene showing how to use face tracking and eye tracking with the Actor Component method.
Upper Body: (“MAP_HighFidelity”) This is a realistic character with implemented body tracking, face tracking, and eye tracking. This scene utilizes the Oculus rig directly and is not compatible with the Unreal Mannequin.
Retargeting - Mannequin: (“MAP_RetargetMannequinAnimBlueprint”) This scene shows retargeting options using the Mannequin and the Animation Node method. However, the techniques used in this sample can be applied to other rigs. This is the most complete example for body tracking.
Face, Eye, and Body Retargeting: (“MAP_HighFidelity_AnimBlueprint”) This sample showcases applying body, face and eye tracking on a realistic character model using the Animation Node method.
The following figure shows the scene select menu included in the samples, which lets you change between the scenes above.
All samples use a common set of components as described in this section.
- Primary Pawn: The pawn you possess when entering VR. A Camera is attached to this pawn, and it showcases first-person embodiment.
- Mirrored Pawn: Identical to the Primary Pawn, except that it is scaled to (-1, 1, 1) to achieve mirroring across the X axis (see the sketch after this list) and has no Camera component. This pawn shows how the Oculus Pawn looks from an external perspective, or third-person embodiment.
- Scene Handler: Sets up the environment, possesses the Primary Pawn, and contains the logic for level switching.
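The mirroring is plain negative scaling of the pawn. As a minimal sketch (the sample does this in Blueprints; the spawning helper below is illustrative), the equivalent in Unreal C++ looks like this:

```cpp
// Minimal sketch: spawn a second pawn and mirror it across the X axis with
// negative scaling, as the sample's Mirrored Pawn does. The helper function
// and the pawn class passed in are placeholders, not the sample's own code.
#include "CoreMinimal.h"
#include "Engine/World.h"
#include "GameFramework/Pawn.h"
#include "Templates/SubclassOf.h"

APawn* SpawnMirroredPawn(UWorld* World, TSubclassOf<APawn> PawnClass, const FTransform& Where)
{
    APawn* Mirrored = World->SpawnActor<APawn>(PawnClass, Where);
    if (Mirrored)
    {
        // (-1, 1, 1) flips the mesh across the X axis so the pawn reads as a mirror image.
        Mirrored->SetActorScale3D(FVector(-1.f, 1.f, 1.f));
    }
    return Mirrored;
}
```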
To switch between levels, use the Scene Switch UI shown in the figure above.
- Using Controllers: Point towards the buttons in the UI and press the A button on the right controller to select.
- Using Hand Tracking: Point towards the buttons in the UI and pinch with your thumb and index finger to select.
Aura - face and eye tracking
This sample showcases the facial and eye tracking capabilities of the Movement SDK. Aura is our character, modeled with slightly stylized and fantastical features: larger-than-normal eyes and floral petals for hair. Because Aura's proportions differ slightly from a human's, the facial tracking data needs to be adjusted to fit the character and constrained so that the character doesn't clip through her own geometry when moving or animating. This is achieved by setting constraints on how the facial data is applied to the character, using facial expression modifiers. Aura has a free-floating head and hands, which means that applying 3-point body tracking (head and two hands) produces convincing results, since there are no elbows or knees to worry about.
Aura is retargeted using the Actor Component implementation. By opening the map and inspecting the pawn you can see that she has the following components added:
- OculusXRFaceTracking implements face tracking triggers to drive morph targets.
- OculusXRBodyTracking implements body tracking to drive the hands.
- OculusXREyeTracking implements the eye tracking interface.
Additionally she has the following components:
- AC_BoneHide hides bones that should not be displayed.
- OculusXRFaceTrackingCorrectives adds tongue tracking support.
These components all improve the tracking in various ways. They live within the sample; you can use them as inspiration for achieving similar functionality or migrate them into your project.
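AC_BoneHide itself is a Blueprint component; as a rough C++ sketch of the same idea (the class and property names below are illustrative, not the sample's), unwanted bones can be hidden through the owning actor's skeletal mesh:

```cpp
// Illustrative C++ equivalent of a bone-hiding helper: hides each named bone
// on the owner's skeletal mesh at startup. Class and property names are hypothetical.
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Components/SkeletalMeshComponent.h"
#include "GameFramework/Actor.h"
#include "BoneHideComponent.generated.h"

UCLASS(ClassGroup = (Custom), meta = (BlueprintSpawnableComponent))
class UBoneHideComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // Bones that should not be rendered, e.g. the legs on a half-body character.
    UPROPERTY(EditAnywhere, Category = "Bone Hide")
    TArray<FName> BonesToHide;

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        if (USkeletalMeshComponent* Mesh = GetOwner()->FindComponentByClass<USkeletalMeshComponent>())
        {
            for (const FName& Bone : BonesToHide)
            {
                // Scales the bone to zero for rendering only; physics is left untouched.
                Mesh->HideBoneByName(Bone, EPhysBodyOp::PBO_None);
            }
        }
    }
};
```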
Face Expression Modifiers
The different characters' faces might not always align with the physical face correctly, which is why face Expression Modifiers are needed. Face morph targets can be modified by disabling them completely or by increasing/decreasing the effect of what the physical face does. For example, when the Aura character smiles, her lips can appear behind her teeth; you can rectify this by using a modifier for the lips.
The face Expression Modifiers are enabled in the Face Tracking component. They are an array of 33 elements.
An individual element is shown below.
- Face Expressions: The array for targeting multiple expressions at once (typically two, for the left and right sides of the face).
- Min Value: Contains the lowest value of the morph target.
- Max Value: Contains the highest value of the morph target.
- Multiplier: Used to increase/decrease the effect of the morph target.
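The modifiers are applied inside the sample's Blueprint face tracking logic. As a rough C++ sketch of the clamp-and-scale idea, using a struct that mirrors the fields above (the names are illustrative):

```cpp
// Illustrative sketch of applying a face expression modifier: the raw tracked
// expression weight is scaled by Multiplier and clamped to [MinValue, MaxValue]
// before driving the morph target.
#include "CoreMinimal.h"

struct FFaceExpressionModifierSketch
{
    TArray<FName> FaceExpressions; // expressions this modifier targets (often a left/right pair)
    float MinValue = 0.f;          // lowest allowed morph target value
    float MaxValue = 1.f;          // highest allowed morph target value
    float Multiplier = 1.f;        // scales the tracked expression weight
};

float ApplyExpressionModifier(float TrackedWeight, const FFaceExpressionModifierSketch& Modifier)
{
    return FMath::Clamp(TrackedWeight * Modifier.Multiplier, Modifier.MinValue, Modifier.MaxValue);
}
```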
Upper body - body, face, and eye tracking
This sample showcases how to use the Movement SDK with a half-body character. This is our 3D-scanned character, and it has realistic proportions, which means the tracking data can be applied directly onto the character. Note that we need to apply Twist Distribution to the arms to make sure they don't twist in an unnatural way.
The model is set up using the Actor Component implementation with the following features:
- OculusXRFaceTracking implements face tracking triggers to drive morph targets.
- OculusXRBodyTracking implements body tracking to drive the hands.
- OculusXREyeTracking implements the eye tracking interface.
Additionally, this character has the following components:
- AC_InverseKinematic is used for repositioning the elbows.
- AC_TwistDistribution provides twists so that the wrists don’t show candy-wrapping.
Retarget Animation Blueprint - Mannequin
This sample showcases applying body tracking to the Unreal 5 Mannequin rig using the OculusXRMovement Body Tracking animation node. Since the Unreal Mannequin skeleton is different from the Oculus Body Tracking skeleton, a bone map is needed to map the tracking skeleton bone names to the target skeleton bones. The OculusXRMovement Body Tracking node supports Upper and Full Body tracking; through the UI panel on the side, you can select between the two modes at runtime. When the sample uses upper body tracking there is no information about the legs. Instead, Foot Grounding keeps the feet on the floor: it runs traces down to the ground and uses IK to place the feet there so that the result appears natural.
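The sample implements Foot Grounding in the AC_FootGrounding Blueprint component. A simplified C++ sketch of the trace step is shown below, assuming the resulting location is then fed to a leg IK node; the function and parameter names are illustrative, not the sample's.

```cpp
// Illustrative sketch of the foot-grounding trace: cast a ray downward from
// the tracked foot position and return a grounded target location for IK.
#include "CoreMinimal.h"
#include "Engine/World.h"
#include "Engine/EngineTypes.h"
#include "CollisionQueryParams.h"

FVector FindGroundedFootLocation(UWorld* World, const FVector& FootLocation,
                                 const AActor* IgnoredActor, float TraceLength = 200.f)
{
    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(IgnoredActor); // don't trace against the character itself

    const FVector TraceStart = FootLocation + FVector(0.f, 0.f, 50.f);
    const FVector TraceEnd = FootLocation - FVector(0.f, 0.f, TraceLength);

    if (World->LineTraceSingleByChannel(Hit, TraceStart, TraceEnd, ECC_Visibility, Params))
    {
        // Place the foot on the hit point; an IK node then pulls the leg to this target.
        return Hit.ImpactPoint;
    }
    return FootLocation; // no ground found, keep the tracked position
}
```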

The Mannequin is set up using the Animation Blueprint implementation with the following features:
- OculusXRBodyTracking node implements body tracking in the Animation Node method.
Additionally, this character has the following components:
- AC_FootGrounding is responsible for tracing down to the ground and finding possible positions for grounding feet.
- AC_FBSChecker checks if Full Body mode is active.
Face, eye, and body retargeting
This sample showcases applying body, face, and eye tracking together on a realistic character model, hence full tracking. Compared to the Upper Body sample described earlier, this example uses the Animation Node implementation of body, face, and eye tracking.
This model is set up using the Animation Blueprint implementation with the following features:
- OculusXRFaceTracking implements face tracking triggers to drive morph targets.
- OculusXRBodyTracking implements body tracking to drive the hands.
- OculusXREyeTracking implements the eye tracking interface.
Additionally this sample has the following components:
- AC_FootGrounding is responsible for tracing down to the ground and finding possible positions for grounding feet.
- AC_FBSChecker checks if Full Body mode is in use.
This section outlines all blueprints in the sample project and explains their purpose:
| Name | Purpose | Path |
|---|---|---|
| AC_BoneHide | Actor component used to hide bones. Used in the half-body samples. | Blueprints/AC_BoneHide |
| AC_AndroidPermissions | Helper component used for requesting permissions on Android. | Blueprints/AC_AndroidPermissions |
| AC_DeformationLogic | Actor component used to deform the body of the character. | Blueprints/Deformation/AC_DeformationLogic |
| AC_DriveSkeletalUpdateLogic | Actor component used to drive the skeletal update logic. | Blueprints/AC_DriveSkeletalUpdateLogic |
| AC_TwistDistribution | Actor component used to distribute the twist of the joints. | Blueprints/TwistDistribution/AC_TwistDistribution |
| BP_AuraOculusPawn | The Oculus Pawn for the Aura character. | Blueprints/Avatars/BP_AuraOculusPawn |
| BP_AxisGizmo | A gizmo used for showing the orientation of joints. | Blueprints/Debug/BP_AxisGizmo |
| BP_MovementSampleGameMode | The Game Mode for the samples. Responsible for setting some render commands and requesting relevant permissions. | Blueprints/BP_MovementSampleGameMode |
| BP_OculusPawn | The base Oculus Pawn from the MetaXR plugin. | Blueprints/Avatars/BP_OculusPawn |
| BP_Printer | Actor used to print debug information to a screen in world space. | Blueprints/Debug/BP_Printer |
| BP_SceneHandler | Actor used to set up the environment and possess the Main Avatar. There is one of these in each sample level. | Blueprints/BP_SceneHandler |
| WBP_BodyTrackingSettings | Widget used to display the body tracking settings. | Blueprints/Widgets/WBP_BodyTrackingSettings |
| WBP_LevelSwitch | Widget used to switch between levels. | Blueprints/Widgets/WBP_LevelSwitch |
| WBP_Recalibrate | Widget used for triggering recalibration of body tracking height. | Blueprints/Widgets/WBP_Recalibrate |
| WBP_SkeletonDebugSettings | Widget used to enable or disable various debug views for the body tracking skeleton. | Blueprints/Widgets/WBP_SkeletonDebugSettings |
Twist distribution - Blueprint
To achieve correct bone/joint rotation and avoid body parts looking like candy wrappers, add a Twist Distribution component to adjust the joints. (This is not needed when using the Animation Node implementation.) The problem can be seen when rotating a hand: the elbow bone and the wrist bone are rotated, but the rotation is not correctly distributed across the joints in between, which leads to the candy-wrapper look:
Without twist distribution:
With twist distribution:
Properties
- Weight: How much of an effect the twist will have.
- Segment Start: The start bone on the opposite side of the twist source.
- Segment End: The target bone containing the twist.
- Segment End Up: Optional. Assign a different bone to be used as a Segment End up vector.
- Twist Joints: The list of twist joints affected by the source rotation. Each joint's Weight is used to distribute the twist differently along the segment.
- Twist Forward Axis: The forward axis for the twist joints, one that points along the twist axis towards the segment end.
- Twist Up Axis: The up axis for the twist joints, one that matches the segment end up axis.
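As a rough illustration of the idea (not the sample's Blueprint logic), the twist between Segment Start and Segment End can be isolated with a swing-twist decomposition around the Twist Forward Axis and then blended onto each twist joint according to its weight:

```cpp
// Illustrative sketch of twist distribution: take the end bone's rotation
// relative to the start bone, extract only the twist about the forward axis,
// and blend a weighted fraction of that twist onto each intermediate joint.
#include "CoreMinimal.h"

struct FTwistJointSetting
{
    FName BoneName;
    float Weight = 0.5f; // how much of the source twist this joint receives
};

FQuat ComputeTwistForJoint(const FQuat& SegmentStartRotation, const FQuat& SegmentEndRotation,
                           const FVector& TwistForwardAxis, const FTwistJointSetting& Joint)
{
    // Rotation of the segment end expressed in the segment start's space.
    const FQuat Delta = SegmentStartRotation.Inverse() * SegmentEndRotation;

    // Split into swing and twist; only the twist around the forward axis is distributed.
    FQuat Swing, Twist;
    Delta.ToSwingTwist(TwistForwardAxis.GetSafeNormal(), Swing, Twist);

    // Blend from identity toward the full twist by the joint's weight.
    return FQuat::Slerp(FQuat::Identity, Twist, Joint.Weight);
}
```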
Deformation - Blueprint
The simulated body's bone spacing along the spine and arms can change after the body state is updated. The original spacings are recorded at Start and are used to correct the spacing after each body state update. The Deformation Logic component adjusts these spacings and has options to skip the Head or the Hips.
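A simplified C++ sketch of that record-and-restore idea is shown below; the data layout and function names are illustrative, not the sample's.

```cpp
// Illustrative sketch of the deformation idea: record each bone's distance to
// its parent once, then after the body state update push the child bone back
// out along the new parent-to-child direction so the recorded spacing is kept.
#include "CoreMinimal.h"

// Captured at startup: bone name -> distance to its parent in the reference pose.
void RecordBoneSpacings(const TMap<FName, FVector>& BonePositions,
                        const TMap<FName, FName>& ParentOf,
                        TMap<FName, float>& OutSpacings)
{
    for (const TPair<FName, FName>& Pair : ParentOf)
    {
        const FVector Child = BonePositions[Pair.Key];
        const FVector Parent = BonePositions[Pair.Value];
        OutSpacings.Add(Pair.Key, FVector::Dist(Child, Parent));
    }
}

// Called after the body state update: restore the original spacing between a
// child bone and its parent, optionally blended by a weight.
FVector CorrectBonePosition(const FVector& ChildPosition, const FVector& ParentPosition,
                            float OriginalSpacing, float Weight = 1.f)
{
    const FVector Direction = (ChildPosition - ParentPosition).GetSafeNormal();
    const FVector Corrected = ParentPosition + Direction * OriginalSpacing;
    return FMath::Lerp(ChildPosition, Corrected, Weight);
}
```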
Properties
- Weight: Used to reposition the lower arm bone according to the upper arm bone.
- Move Speed: If using Move Towards, the hands will be smoothly repositioned instead of instantly.

You can use AC_AndroidPermissions in your project to help you request permissions on the device:
- In the Movement SDK sample, go to Content > Blueprints > AC_AndroidPermissions and migrate this to your project. To do this, right-click the asset and use Asset Actions > Migrate to move this to your project.
- In your project, add the AC_AndroidPermissions as a component to your GameMode.
- During your GameMode's BeginPlay, request the relevant permissions by calling the RequestPermissions method on the component.
You can take a look at our BP_MovementSampleGameMode to see how this can be set up.
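If you prefer C++ over the Blueprint component, a rough equivalent using the engine's AndroidPermission plugin is sketched below. The Movement-related permission strings listed here are assumptions; verify them against your AndroidManifest and the SDK documentation.

```cpp
// Rough C++ equivalent of AC_AndroidPermissions, using the engine's
// AndroidPermission plugin (enable the plugin and add "AndroidPermission" to
// your module's dependencies). Call this from your GameMode's BeginPlay.
#if PLATFORM_ANDROID
#include "AndroidPermissionFunctionLibrary.h"
#endif

void RequestMovementPermissions()
{
#if PLATFORM_ANDROID
    // Assumed Movement-related permissions; check these against your manifest.
    TArray<FString> Permissions;
    Permissions.Add(TEXT("com.oculus.permission.BODY_TRACKING"));
    Permissions.Add(TEXT("com.oculus.permission.FACE_TRACKING"));
    Permissions.Add(TEXT("com.oculus.permission.EYE_TRACKING"));

    // Only ask for the permissions the user has not granted yet.
    TArray<FString> ToRequest;
    for (const FString& Permission : Permissions)
    {
        if (!UAndroidPermissionFunctionLibrary::CheckPermission(Permission))
        {
            ToRequest.Add(Permission);
        }
    }
    if (ToRequest.Num() > 0)
    {
        UAndroidPermissionFunctionLibrary::AcquirePermissions(ToRequest);
    }
#endif
}
```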