Unity Movement sample overview

Updated: May 11, 2026

Overview

The Unity Movement sample demonstrates retargeting of body and face tracking data, AI-driven motion synthesis, and networked avatars using the Meta XR Movement SDK. It supports multiple manifestations (half-body and full-body) and includes multiplayer networking via Unity Netcode for GameObjects (NGO) or Photon Fusion 2.

Key features

| Feature | Implementation | Description |
| --- | --- | --- |
| Body tracking | CharacterRetargeter pipeline | Source data → processors → retargeter → target processors |
| Face tracking | OVRFaceExpressions + FaceDriver | 63 facial expressions with A2E system |
| AI motion synthesis | Neural locomotion blending | Blends neural locomotion with live tracking |
| Networking | NGO or Photon Fusion 2 | Multiplayer avatar synchronization |
| Calibration | T-pose | Initial body calibration for retargeting accuracy |

Retargeting pipeline

ISourceDataProvider → Source Processors → SkeletonRetargeter → Target Processors
  • CharacterRetargeter orchestrates the full pipeline
  • ISourceDataProvider supplies raw tracking data
  • Source processors clean and filter input
  • SkeletonRetargeter maps source to target skeleton
  • Target processors apply final adjustments
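The stage ordering above can be sketched as a simple function composition. This is an illustrative, language-agnostic sketch (the actual SDK is C#); all function and stage names here are hypothetical, not real Movement SDK APIs.

```python
# Hypothetical sketch of the retargeting dataflow: source processors run
# first, then the skeleton retargeter, then target processors, in order.

def apply_pipeline(source_pose, source_processors, retarget, target_processors):
    """Run one frame of pose data through the four pipeline stages."""
    for process in source_processors:      # clean and filter raw tracking input
        source_pose = process(source_pose)
    target_pose = retarget(source_pose)    # map source joints onto the target skeleton
    for process in target_processors:      # final adjustments (twist, IK, pinning, ...)
        target_pose = process(target_pose)
    return target_pose

# Toy stages: round out jitter, scale to avatar proportions, clamp a joint.
smooth = lambda pose: {j: round(v, 2) for j, v in pose.items()}
to_avatar = lambda pose: {j: v * 1.1 for j, v in pose.items()}
clamp_elbow = lambda pose: {**pose, "elbow": min(pose["elbow"], 2.0)}

pose = {"elbow": 2.504, "wrist": 0.333}
result = apply_pipeline(pose, [smooth], to_avatar, [clamp_elbow])
```

Because each stage takes and returns a pose, new processors can be inserted at either end of the chain without touching the retargeter itself, which is the same extension point the Custom target processor exposes.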

Target processor types

| Processor | Purpose |
| --- | --- |
| Twist | Distributes rotation across twist bones |
| Animation | Blends tracking with authored animations |
| Locomotion | Drives root motion from tracking |
| CCDIK | Cyclic Coordinate Descent inverse kinematics |
| HandIK | Hand-specific IK adjustments |
| HipPinning | Stabilizes hip position |
| Custom | User-defined processing |

Face tracking

  • OVRFaceExpressions provides 63 facial expressions
  • FaceDriver maps expressions to blend shapes
  • A2E (Audio to Expression) system provides speech-driven facial animation
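The mapping step can be pictured as a weight lookup: each tracked expression weight (normalized 0 to 1) is routed to a named blend shape on the avatar mesh, where Unity blend-shape weights conventionally run 0 to 100. This is a hypothetical sketch, not the FaceDriver implementation; the expression and shape names below are invented for illustration.

```python
# Hypothetical sketch of driving mesh blend shapes from tracked
# expression weights (the SDK's FaceDriver is a C# component).

def drive_blend_shapes(expression_weights, mapping):
    """Map expression weights (0..1) onto blend-shape values (0..100)."""
    shapes = {}
    for expression, weight in expression_weights.items():
        target = mapping.get(expression)
        if target is None:
            continue  # this expression has no blend shape on the mesh
        # Clamp, then rescale to Unity's 0..100 blend-shape convention.
        shapes[target] = max(0.0, min(1.0, weight)) * 100.0
    return shapes

weights = {"jawDrop": 0.4, "eyeBlinkL": 1.2, "browRaise": 0.1}
mapping = {"jawDrop": "JawOpen", "eyeBlinkL": "BlinkLeft"}
shapes = drive_blend_shapes(weights, mapping)
```

Clamping before rescaling keeps out-of-range tracking values (like the 1.2 above) from overdriving a shape, and unmapped expressions are simply dropped.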

Additional features

  • AI motion synthesizer blending neural locomotion with live tracking data
  • T-pose calibration for accurate body proportions
  • Manifestations: half-body and full-body representations
  • Eight sample scenes demonstrating different capabilities
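T-pose calibration works by comparing the user's tracked proportions against the avatar's rest pose. A minimal sketch of that idea, assuming per-bone length ratios (the sample's actual calibration is implemented in C#, and the bone names here are invented):

```python
# Hypothetical sketch of T-pose calibration: derive per-bone scale
# factors that map the user's measured proportions onto the avatar.

def calibrate(tracked_lengths, avatar_lengths):
    """Return per-bone scale factors (avatar length / tracked length)."""
    scales = {}
    for bone, avatar_len in avatar_lengths.items():
        tracked = tracked_lengths.get(bone)
        if not tracked:            # bone untracked or zero-length: leave unscaled
            scales[bone] = 1.0
        else:
            scales[bone] = avatar_len / tracked
    return scales

# User's upper arm measures 0.30 m in the captured T-pose; avatar's is 0.27 m.
scales = calibrate({"upper_arm": 0.30, "forearm": 0.25},
                   {"upper_arm": 0.27, "forearm": 0.25, "hand": 0.08})
```

Bones the headset cannot measure (like the hand above) fall back to a neutral scale of 1.0 rather than distorting the avatar.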

Requirements

  • Quest 2, Quest 3, or Quest 3S
  • Unity 6000.0.66f2
  • Meta XR Core SDK v81

Getting started

  1. Clone the repository: git clone https://github.com/oculus-samples/Unity-Movement.git
  2. Open the project in Unity 6000.0.66f2
  3. Install Meta XR Core SDK v81
  4. Enable body and face tracking on your Quest device
  5. Build and deploy to Quest