Hand Sample overview

Updated: May 11, 2026

Overview

This sample demonstrates hand tracking configuration, custom hand meshes, controller visualization, and input mapping for Meta Quest in Unreal Engine. It includes three maps covering hand tracking fundamentals, custom hand mesh workflows, and simultaneous controller and hand tracking. The project uses the OculusXR plugin with OpenXR hand tracking and 18 Enhanced Input Actions.

What you will learn

  • Configure OpenXR hand tracking with the OculusXR plugin
  • Apply custom hand meshes using the skeletal mesh workflow
  • Enable simultaneous hand and controller tracking (ControllersAndHands mode)
  • Map capacitive touch inputs for finger state detection
  • Set up Enhanced Input Actions for VR hand and controller input

Requirements

  • Meta Quest 2, Quest 3, or Quest 3S
  • Unreal Engine configured for Meta Quest development
  • OculusXR plugin installed and enabled
For setup instructions, see the Meta Quest Developer Hub documentation.

Get started

Clone the repository from GitHub, open the project in Unreal Engine, and build for Android. The project contains three maps demonstrating progressively complex hand tracking features. For detailed build and configuration steps, see the project README.

Explore the sample

  • HandTrackingSample — Basic hand tracking setup and visualization. Key concepts: OpenXR hand tracking, OculusXR plugin, bone visualization.
  • HandTrackingCustomSample — Custom hand meshes that replace the default hand rendering. Key concepts: l_hand_skeletal_lowres and r_hand_skeletal_lowres meshes, skeleton remapping.
  • ControllerHandSample — Simultaneous controller and hand input. Key concepts: ControllersAndHands mode, capacitive touch, input switching.

Runtime behavior

When running on a Meta Quest device, the HandTrackingSample map displays tracked hand skeletons using the OculusXR plugin’s OpenXR hand tracking. The HandTrackingCustomSample map replaces default rendering with custom skeletal meshes (l_hand_skeletal_lowres, r_hand_skeletal_lowres) that follow tracked bone positions. The ControllerHandSample map enables the ControllersAndHands tracking mode, allowing seamless switching between controller input and hand tracking. Capacitive touch inputs detect finger positions (ThumbUp, Pointing) without requiring full button presses. The project includes 13+7 hand animations and 23 controller animations per side.

Key concepts

OpenXR hand tracking

The sample configures hand tracking through the OculusXR plugin using OpenXR APIs:
// Enable hand tracking in Project Settings > OculusXR > Input
// Set Hand Tracking Support to "Controllers And Hands" for simultaneous mode
// The OculusXR plugin handles OpenXR hand tracking extension negotiation
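The same settings can also be written directly into the project's DefaultEngine.ini. This is a hedged sketch: the section and key names below assume the OculusXR plugin's runtime settings class and should be verified against your installed plugin version:

```ini
; Sketch (assumed names): DefaultEngine.ini entries for the OculusXR plugin.
[/Script/OculusXRHMD.OculusXRHMDRuntimeSettings]
; ControllersAndHands enables simultaneous controller and hand tracking
HandTrackingSupport=ControllersAndHands
; Higher tracking frequency improves responsiveness at some power cost
HandTrackingFrequency=HIGH
```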

Custom hand mesh workflow

Custom hand meshes follow the skeletal mesh workflow using low-resolution hand models:
// Custom meshes: l_hand_skeletal_lowres, r_hand_skeletal_lowres
// Meshes are bound to the OculusXR hand skeleton
// Bone transforms from OpenXR drive the skeletal mesh pose each frame
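The per-frame pose update can be sketched in Unreal C++ as follows. This is not the sample's actual implementation: the function library, enum names, and the `AMyHandActor` class are assumptions based on the OculusXR plugin's input API and may differ by plugin version.

```cpp
// Hedged sketch: drive a custom hand mesh from tracked bone rotations.
// UOculusXRInputFunctionLibrary and its enums are assumed names; verify
// them against your installed OculusXR plugin version.
#include "OculusXRInputFunctionLibrary.h"
#include "Components/PoseableMeshComponent.h"

void AMyHandActor::UpdateHandPose(UPoseableMeshComponent* HandMesh)
{
    // Only apply tracked data when tracking confidence is high
    if (UOculusXRInputFunctionLibrary::GetTrackingConfidence(EOculusXRHandType::HandLeft)
        != EOculusXRTrackingConfidence::High)
    {
        return;
    }

    for (int32 BoneIndex = 0; BoneIndex < HandMesh->GetNumBones(); ++BoneIndex)
    {
        // Map each mesh bone to its OpenXR bone and apply the tracked rotation
        const FName BoneName = HandMesh->GetBoneName(BoneIndex);
        const FQuat BoneRotation = UOculusXRInputFunctionLibrary::GetBoneRotation(
            EOculusXRHandType::HandLeft, static_cast<EOculusXRBone>(BoneIndex));
        HandMesh->SetBoneRotationByName(BoneName, BoneRotation.Rotator(),
            EBoneSpaces::ComponentSpace);
    }
}
```

In practice the mesh-bone-to-OpenXR-bone mapping (the skeleton remapping the table above mentions) is rarely a straight index match; a lookup table per mesh is the usual design choice.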

Capacitive touch input

Capacitive touch sensors on Meta Quest controllers detect finger proximity without requiring a full press:
// Capacitive touch inputs:
// ThumbUp - thumb lifted from thumbstick/buttons
// Pointing - index finger lifted from trigger
// These map to Enhanced Input Actions for gesture detection
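As a rough illustration, touch state can also be polled directly through the player controller. The FKey names below are assumptions based on Unreal's Oculus Touch key naming, not taken from the sample; confirm them under Project Settings > Input:

```cpp
// Hedged sketch: deriving ThumbUp / Pointing from capacitive touch keys.
// Key names are assumed; check them in the Unreal editor's input settings.
const bool bIndexTouching =
    PC->IsInputKeyDown(FKey("OculusTouch_Right_Trigger_Touch"));
const bool bThumbTouching =
    PC->IsInputKeyDown(FKey("OculusTouch_Right_Thumbstick_Touch"));

// A lifted finger (no capacitive contact) implies the gesture
const bool bPointing = !bIndexTouching;
const bool bThumbUp  = !bThumbTouching;
```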

Enhanced Input Actions

The project defines 18 Enhanced Input Actions covering both hand tracking and controller inputs:
// Input actions cover:
// - Hand tracking bone poses
// - Controller button presses
// - Capacitive touch states (ThumbUp, Pointing)
// - Trigger and grip analog values
// - Thumbstick axes
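Binding such actions in C++ follows Unreal's standard Enhanced Input pattern. The sketch below uses placeholder names (`AMyVRPawn`, `VRMappingContext`, `GripAction`) rather than the sample's actual assets:

```cpp
// Hedged sketch: standard UE5 Enhanced Input binding for a VR pawn.
// Class and asset names are placeholders, not from the sample project.
#include "EnhancedInputComponent.h"
#include "EnhancedInputSubsystems.h"

void AMyVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Register the mapping context with the local player's input subsystem
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        if (auto* Subsystem = ULocalPlayer::GetSubsystem<UEnhancedInputLocalPlayerSubsystem>(
                PC->GetLocalPlayer()))
        {
            Subsystem->AddMappingContext(VRMappingContext, /*Priority=*/0);
        }
    }

    // Bind an analog grip action to a handler receiving its float value
    if (auto* Input = Cast<UEnhancedInputComponent>(PlayerInputComponent))
    {
        Input->BindAction(GripAction, ETriggerEvent::Triggered,
                          this, &AMyVRPawn::OnGrip);
    }
}

void AMyVRPawn::OnGrip(const FInputActionValue& Value)
{
    // Analog grip value ranges from 0.0 (open) to 1.0 (full grip)
    const float GripAmount = Value.Get<float>();
    // Drive a hand animation blend from GripAmount here
}
```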

Extend the sample

  • Replace the low-resolution hand meshes with custom-modeled hands for your application.
  • Add new capacitive touch gestures by combining ThumbUp and Pointing states.
  • Implement hand-to-controller transition animations for smooth mode switching.
  • Extend the Enhanced Input mapping context with application-specific actions.