This sample demonstrates how to integrate the Meta Avatars SDK with the Meta XR Interaction SDK (ISDK) to create realistic hand-driven avatar interactions in Unity. You explore hand tracking delegation, coordinate space conversion, and interaction patterns for grab and poke interactions. In this sample, you learn how to:
- Bridge ISDK hand tracking data to the Avatars SDK using delegate interfaces
- Convert poses between the Unity, ISDK, and Avatar SDK coordinate systems
- Implement grab and poke interactions with avatar hand representations
- Handle version compatibility between different SDK releases
- Choose between the OVR API and OpenXR hand tracking backends
Requirements
- Device: Meta Quest 2, Quest 3, or Quest 3S with hand tracking enabled
- Development environment: Unity 2022.3.11f1 or newer, with the Meta XR SDKs installed
For platform setup instructions, see the Unity development setup guide.
Get started
1. Clone or download the sample from the Unity-MetaXRInteractionSDK-AvatarSample repository.
2. Open the project in Unity 2022.3.11f1 or newer.
3. Load either the AvatarGrabExamples or AvatarPokeExamples scene from the Assets/Scenes/ folder.
4. Build and deploy to your Quest device following the build instructions in the repository README.

The sample loads your Meta avatar from the CDN and falls back to a preset avatar if loading fails.
Explore the sample
| File/Scene | What it demonstrates | Key concepts |
| --- | --- | --- |
| AvatarGrabExamples.unity | Hand-grab interactions with physical objects using avatar hand representations | Grabbable interactables, hand anchors, ISDK-to-Avatar hand tracking bridge |
| AvatarPokeExamples.unity | Poke interactions with UI canvases (flat, curved, scrollable) using avatar fingers | PointableCanvasModule integration, pokeable surfaces, UI interaction with avatar hands |
| Avatar prefab | Minimal avatar entity with CDN loading and fallback logic | SampleAvatarEntity, OvrAvatarEntity configuration, user avatar loading |
| HandTrackingInputManager.cs | Bridge between ISDK hand tracking and the Avatars SDK | Delegate creation, version branching, provider wrappers vs. BodyTracking |
| InteractionAvatarConversions.cs | Coordinate space conversion utilities | PoseToAvatarTransform methods, axis flips for different contexts |
| HandTrackingDelegate.cs | OVR API hand tracking implementation | 17-joint mapping, Z-flip for wrists, X-flip for rotations |
| OpenXRHandTrackingDelegate.cs | OpenXR hand tracking implementation | HandMirroring.HandSpace conversion, explicit hand space definitions |
Runtime behavior
When you run either scene, your Meta avatar’s hands mirror your physical hand movements tracked by the Quest device. In AvatarGrabExamples, you reach out and grab objects with natural hand poses. The avatar hands maintain contact and grip visualization throughout the interaction. In AvatarPokeExamples, you poke buttons and interact with UI canvases using your index finger, with the avatar hand following your movements and triggering UI responses on contact. The avatar loads from the Meta CDN and displays your personalized avatar if available, otherwise it falls back to a preset avatar.
Key concepts
Delegate-based hand tracking bridge
The sample bridges ISDK hand tracking to the Avatars SDK through delegate interfaces rather than direct API calls. HandTrackingInputManager creates instances of IOvrAvatarHandTrackingDelegate and IOvrAvatarInputTrackingDelegate, which the Avatar system polls each frame for hand and headset pose data. This pattern decouples the two SDKs while allowing real-time data flow.
See HandTrackingInputManager.cs in the sample repository.
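As a rough illustration of the delegate shape, a minimal input tracking delegate might look like the sketch below. This is not the sample's implementation: it assumes the IOvrAvatarInputTrackingDelegate interface, the field names shown on OvrAvatarInputTrackingState, and that the sample's PoseToAvatarTransform helper accepts a Unity Pose.

```csharp
using Oculus.Avatar2;
using UnityEngine;

// Sketch: feed a headset pose to the Avatars SDK through a delegate.
// Field names on OvrAvatarInputTrackingState are assumptions here;
// verify them against your Avatars SDK version.
public class HmdTrackingDelegate : IOvrAvatarInputTrackingDelegate
{
    private readonly Transform _hmd; // headset anchor from the scene rig

    public HmdTrackingDelegate(Transform hmd) => _hmd = hmd;

    // Polled by the avatar system each frame.
    public bool GetInputTrackingState(out OvrAvatarInputTrackingState state)
    {
        state = default;
        if (_hmd == null) return false;

        state.headsetActive = true;
        // Convert the Unity pose into the Avatar SDK's coordinate space.
        state.headset = InteractionAvatarConversions.PoseToAvatarTransform(
            new Pose(_hmd.position, _hmd.rotation));
        return true;
    }
}
```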
Coordinate space conversion
The sample uses three conversion methods depending on the data context. For world-space tracking positions, PoseToAvatarTransform performs a direct 1:1 conversion. For wrist positions, PoseToAvatarTransformFlipZ applies a Z-axis flip. For joint rotations, UnityToAvatarQuaternionFlipX negates the Y and Z quaternion components. These conversions account for the coordinate system differences between Unity and the Avatar SDK’s CAPI layer.
```csharp
public static CAPI.ovrAvatar2Quatf UnityToAvatarQuaternionFlipX(Quaternion quat)
{
    return new CAPI.ovrAvatar2Quatf
        { w = quat.w, x = quat.x, y = -quat.y, z = -quat.z };
}
```
See InteractionAvatarConversions.cs in the sample repository.
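For comparison, a plausible shape for the wrist conversion is sketched below. The field names on CAPI.ovrAvatar2Transform and the exact pairing of position and rotation flips are assumptions; the authoritative versions live in InteractionAvatarConversions.cs.

```csharp
// Sketch of a Z-flip pose conversion for wrist data. Field names on
// CAPI.ovrAvatar2Transform / ovrAvatar2Vector3f are assumed; check the
// sample's InteractionAvatarConversions.cs for the real definitions.
public static CAPI.ovrAvatar2Transform PoseToAvatarTransformFlipZ(Pose pose)
{
    return new CAPI.ovrAvatar2Transform
    {
        position = new CAPI.ovrAvatar2Vector3f
        {
            x = pose.position.x,
            y = pose.position.y,
            z = -pose.position.z // mirror across the XY plane
        },
        orientation = UnityToAvatarQuaternionFlipX(pose.rotation),
        scale = new CAPI.ovrAvatar2Vector3f { x = 1f, y = 1f, z = 1f }
    };
}
```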
Version-compatible delegate registration
The sample supports both older and current Avatars SDK versions using preprocessor branching on AVATARS_29_7_OR_NEWER. Versions 29.7 and newer register provider wrappers in OnTrackingInitialized, while older versions poll the BodyTracking property directly. This pattern allows the sample to compile and run across different SDK releases without code duplication.
See the Start() and OnTrackingInitialized() methods in HandTrackingInputManager.cs in the sample repository.
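The shape of that branching looks roughly like the fragment below. The registration calls and member names are placeholders, not the sample's exact API; the real code is in the methods referenced above.

```csharp
#if AVATARS_29_7_OR_NEWER
    protected override void OnTrackingInitialized()
    {
        // 29.7+: wrap the delegates in providers and register them once
        // tracking is ready. Both calls are placeholders for illustration.
        RegisterInputTrackingProvider(_inputTrackingDelegate);
        RegisterHandTrackingProvider(_handTrackingDelegate);
    }
#else
    private void Start()
    {
        // Pre-29.7: assign the delegates through the polled BodyTracking
        // property. Member names are placeholders for illustration.
        BodyTracking.InputTrackingDelegate = _inputTrackingDelegate;
        BodyTracking.HandTrackingDelegate = _handTrackingDelegate;
    }
#endif
```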
Dual hand tracking backend support
The sample implements two hand tracking backends: OVR API and OpenXR. HandTrackingDelegate uses the OVR API path with 17 joints per hand and specific axis flips, while OpenXRHandTrackingDelegate uses explicit hand space definitions and root offsets. The active implementation is selected at compile time via the ISDK_OPENXR_HAND preprocessor symbol, allowing the same codebase to target different XR runtimes.
See HandTrackingDelegate.cs and OpenXRHandTrackingDelegate.cs in the sample repository.
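Selecting the backend then reduces to a compile-time branch. A sketch, assuming for illustration that both delegate classes share a constructor signature:

```csharp
// Sketch: choose the hand tracking backend at compile time via the
// ISDK_OPENXR_HAND symbol the sample uses. The shared constructor
// signature is an assumption for illustration.
private IOvrAvatarHandTrackingDelegate CreateHandTrackingDelegate()
{
#if ISDK_OPENXR_HAND
    // OpenXR runtime: explicit hand space definitions and root offsets.
    return new OpenXRHandTrackingDelegate(_leftHand, _rightHand);
#else
    // OVR API runtime: 17-joint mapping with wrist/rotation axis flips.
    return new HandTrackingDelegate(_leftHand, _rightHand);
#endif
}
```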
Prefab-based integration architecture
The sample packages the integration into two reusable prefabs. OculusInteractionAvatarSdkManager bundles OvrAvatarManager, LOD management, GPU skinning, and HandTrackingInputManager, with serialized fields for the _hmd, _leftHand, and _rightHand references. The Avatar prefab provides a minimal entity with SampleAvatarEntity configured for CDN loading and preset fallback. This prefab architecture lets you drop the integration into new projects without manual component wiring.
See the prefabs in the Assets/Prefabs/ folder in the sample repository.
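When you bring the prefabs into your own scene, the main wiring step is assigning the three anchor references. A small, hypothetical safety check like the one below (not part of the sample) can catch missing references early; the field names mirror the serialized references described above.

```csharp
using UnityEngine;

// Hypothetical helper: warn at startup if the tracking anchors on
// HandTrackingInputManager were left unassigned in the Inspector.
public class AvatarIntegrationValidator : MonoBehaviour
{
    [SerializeField] private Transform _hmd;
    [SerializeField] private Transform _leftHand;
    [SerializeField] private Transform _rightHand;

    private void Awake()
    {
        if (_hmd == null || _leftHand == null || _rightHand == null)
        {
            Debug.LogError(
                "Assign _hmd, _leftHand, and _rightHand before entering Play mode.",
                this);
        }
    }
}
```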
Extend the sample
- Modify SampleAvatarEntity.cs to customize avatar loading behavior, such as adding custom retry logic (see the sketch after this list), loading specific preset avatars, or handling avatar creation events
- Combine the hand tracking delegation pattern with custom ISDK interactions by creating new interactable objects in the grab or poke scenes and wiring them to existing interactors
- For gesture-based interactions, explore the Interaction SDK Gesture Recognition Sample to add hand pose detection alongside avatar representation
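As one example of custom retry logic, the sketch below retries the CDN load a few times before falling back to a preset. TryLoadUserAvatar and LoadLocalPresetAvatar are placeholder names, not SDK calls; wire them to the actual loading methods in SampleAvatarEntity.cs.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical retry wrapper around the sample's avatar loading.
// TryLoadUserAvatar() and LoadLocalPresetAvatar() are placeholders for
// the real CDN-load and preset-fallback calls in SampleAvatarEntity.cs.
public class AvatarRetryLoader : MonoBehaviour
{
    [SerializeField] private int _maxAttempts = 3;
    [SerializeField] private float _retryDelaySeconds = 2f;

    private IEnumerator Start()
    {
        for (int attempt = 1; attempt <= _maxAttempts; attempt++)
        {
            if (TryLoadUserAvatar()) yield break; // placeholder CDN load
            Debug.LogWarning($"Avatar load attempt {attempt} failed; retrying.");
            yield return new WaitForSeconds(_retryDelaySeconds);
        }
        LoadLocalPresetAvatar(); // placeholder preset fallback
    }

    private bool TryLoadUserAvatar() { /* wrap the sample's CDN load */ return false; }
    private void LoadLocalPresetAvatar() { /* wrap the sample's preset load */ }
}
```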