Unreal Movement sample overview
Updated: May 11, 2026
This sample demonstrates the OculusXRMovement plugin ecosystem, including body, eye, and face tracking integration for Meta Quest headsets. It provides three custom Animation Graph nodes, a complete retargeting system with four modes, and five pawn variants that showcase different tracking configurations. Using this sample, you can:
- Integrate body, eye, and face tracking using the OculusXRMovement plugin
- Use three Animation Graph nodes: FAnimNode_OculusXRBodyTracking, FAnimNode_OculusXREyeTracking, and FAnimNode_OculusXRFaceTracking
- Configure four retargeting modes (RotationAndPositions, RotationOnlyUniformScale, RotationOnly, and None)
- Apply face tracking corrective shapes including combination and in-between shapes
- Use per-expression modifiers to fine-tune tracking output
- Generate bone mappings with the GenerateBoneMapping() utility
- Visualize tracking data with five debug draw modes
- Meta Quest 2, Quest 3, or Quest 3S
- Unreal Engine 5.6 with the MetaXR plugin
Clone or download the sample from the GitHub repository. Open the project in Unreal Engine 5.6 with the MetaXR plugin installed. Build and deploy to your Meta Quest device. The sample includes four maps that demonstrate different tracking configurations. Start with MAP_Aura for a general overview, then explore the high-fidelity maps for advanced retargeting examples.
| File / Scene | What it demonstrates | Key concepts |
|---|---|---|
| MAP_Aura (map) | General body and face tracking overview | OculusXRMovement plugin basics, pawn setup |
| MAP_HighFidelity (map) | High-fidelity body tracking with retargeting | Retargeting modes, bone mapping |
| MAP_HighFidelity_AnimBlueprint (map) | Animation Blueprint-driven high-fidelity tracking | Animation Graph nodes, Blueprint integration |
| MAP_RetargetMannequinAnimBlueprint (map) | Retargeting to Mannequin skeleton | Cross-skeleton retargeting, GenerateBoneMapping() |
| FAnimNode_OculusXRBodyTracking | Body tracking Animation Graph node | Retargeting modes, skeletal mapping |
| FAnimNode_OculusXREyeTracking | Eye tracking Animation Graph node | Gaze direction, eye openness |
| FAnimNode_OculusXRFaceTracking | Face tracking Animation Graph node | Blend shapes, corrective shapes, per-expression modifiers |
The sample provides five pawn variants, each configured with a different combination of body, eye, and face tracking. At runtime, the Animation Graph nodes receive tracking data from the OculusXRMovement plugin and apply it to the character skeleton using the configured retargeting mode.
Body tracking supports four retargeting modes that control how tracking data maps to skeletons of different proportions. Face tracking applies corrective shapes (combination and in-between) on top of base blend shapes for more realistic facial animation. Five debug draw modes visualize raw tracking data, retargeted poses, and bone mappings.
The sample separates plugin code from sample code. The OculusXRMovement plugin provides low-level tracking access and the three Animation Graph nodes. The sample layer demonstrates how to configure those nodes, set up retargeting, and combine tracking types in complete pawn setups.
Four retargeting modes control how body tracking data maps to target skeletons; the sketch after this list illustrates how they differ:
- RotationAndPositions: Full position and rotation retargeting for matching skeletons
- RotationOnlyUniformScale: Rotation with uniform scale compensation for different skeleton sizes
- RotationOnly: Rotation only, preserving target skeleton proportions
- None: Raw tracking data without retargeting
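To make the differences concrete, here is a minimal, self-contained sketch of how each mode could treat a single tracked bone transform. It is an illustration only, not the plugin's implementation; the enum and function names are hypothetical.

```cpp
// Illustrative sketch: how the four retargeting modes could treat one
// tracked bone transform. NOT the plugin's implementation; the enum and
// function names below are hypothetical.
#include "Math/Transform.h"

enum class ERetargetingModeSketch
{
    RotationAndPositions,
    RotationOnlyUniformScale,
    RotationOnly,
    None
};

FTransform RetargetBoneSketch(
    const FTransform& Tracked,   // bone transform from body tracking
    const FTransform& Reference, // target skeleton's reference pose
    float UniformScale,          // e.g. target height / tracked height
    ERetargetingModeSketch Mode)
{
    switch (Mode)
    {
    case ERetargetingModeSketch::RotationAndPositions:
        // Copy rotation and position; assumes the skeletons match.
        return FTransform(Tracked.GetRotation(), Tracked.GetLocation());

    case ERetargetingModeSketch::RotationOnlyUniformScale:
        // Copy rotation; scale tracked positions uniformly to compensate
        // for a differently sized target skeleton.
        return FTransform(Tracked.GetRotation(),
                          Tracked.GetLocation() * UniformScale);

    case ERetargetingModeSketch::RotationOnly:
        // Copy rotation only; keep the target skeleton's own bone
        // positions so its proportions are preserved.
        return FTransform(Tracked.GetRotation(), Reference.GetLocation());

    case ERetargetingModeSketch::None:
    default:
        // Pass the raw tracking transform through untouched.
        return Tracked;
    }
}
```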
Face tracking corrective shapes
Face tracking uses corrective blend shapes that activate based on combinations of base expressions:
```cpp
// Combination shapes activate when multiple base shapes exceed thresholds
// In-between shapes activate at specific ranges of a base shape
// Per-expression modifiers scale individual expression values before applying
```
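The following sketch illustrates these three ideas. It is an assumption-laden example, not the plugin's API: the function names, the threshold rule, and the tent-shaped in-between curve are all illustrative.

```cpp
// Illustrative only: these helpers show the idea behind corrective shapes
// and modifiers; names and formulas are assumptions, not the plugin's API.
#include "Math/UnrealMathUtility.h"

// A combination shape fires only once both of its base shapes exceed a
// threshold, then ramps with the smaller of the two weights.
float EvaluateCombinationShape(float BaseWeightA, float BaseWeightB, float Threshold)
{
    if (BaseWeightA < Threshold || BaseWeightB < Threshold)
    {
        return 0.0f;
    }
    return FMath::Min(BaseWeightA, BaseWeightB);
}

// An in-between shape peaks at a specific point in a base shape's range
// (for example PeakAt = 0.5f) and fades to zero at both ends, so it can
// correct the mid-pose of a blend without affecting the extremes.
float EvaluateInBetweenShape(float BaseWeight, float PeakAt)
{
    if (BaseWeight <= 0.0f || BaseWeight >= 1.0f)
    {
        return 0.0f;
    }
    return BaseWeight <= PeakAt
        ? BaseWeight / PeakAt
        : (1.0f - BaseWeight) / (1.0f - PeakAt);
}

// A per-expression modifier rescales one expression's weight before it is
// applied, e.g. to exaggerate a smile or damp a brow raise.
float ApplyExpressionModifier(float Weight, float Multiplier, float MinValue, float MaxValue)
{
    return FMath::Clamp(Weight * Multiplier, MinValue, MaxValue);
}
```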
The GenerateBoneMapping() utility automates bone name matching between source tracking data and target skeleton hierarchies.
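As a rough illustration of what name-based matching involves, here is a self-contained sketch. It is not the plugin's GenerateBoneMapping() implementation; the normalization rules and function names are assumptions.

```cpp
// Illustrative sketch of name-based bone matching; this is not the
// plugin's GenerateBoneMapping() implementation, just the general idea.
#include "Containers/Array.h"
#include "Containers/Map.h"
#include "Containers/UnrealString.h"
#include "UObject/NameTypes.h"

// Lowercase the name and strip separators so that e.g. "LeftArm",
// "left_arm", and "Left-Arm" all normalize to "leftarm".
static FString NormalizeBoneName(const FName& Bone)
{
    FString Name = Bone.ToString().ToLower();
    Name.ReplaceInline(TEXT("_"), TEXT(""));
    Name.ReplaceInline(TEXT("-"), TEXT(""));
    return Name;
}

// Pair each source bone with the first target bone whose normalized name
// matches; unmatched bones are simply left out of the map.
TMap<FName, FName> BuildBoneMappingSketch(
    const TArray<FName>& SourceBones,
    const TArray<FName>& TargetBones)
{
    TMap<FName, FName> Mapping;
    for (const FName& Source : SourceBones)
    {
        const FString NormalizedSource = NormalizeBoneName(Source);
        for (const FName& Target : TargetBones)
        {
            if (NormalizeBoneName(Target) == NormalizedSource)
            {
                Mapping.Add(Source, Target);
                break;
            }
        }
    }
    return Mapping;
}
```

Normalizing names before comparison is what lets a single pass handle skeletons with different naming conventions, such as a tracking skeleton using `LeftArm` and a target rig using `left_arm`.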
Ideas for extending the sample:
- Add custom retargeting logic for non-humanoid characters using the Animation Graph node outputs.
- Combine face tracking with lip sync for more realistic speaking animations.
- Create new pawn variants that blend multiple tracking types with gameplay animations.
- Use per-expression modifiers to create stylized or exaggerated facial animation.