
Hand Tracking in Unreal Engine

Updated: Sep 15, 2025

Overview

The hand tracking feature enables the use of hands as an input method on Meta Quest headsets. It provides a new sense of presence, enhances social engagement, and can help deliver more natural interactions.
The hand tracking feature also allows you to develop UI elements that can be operated with hands and controllers interchangeably. With hands enabled, users can pinch or poke objects for near-field interactions. For far-field interactions, the hand’s pose drives a laser cursor-pointer that behaves like the standard controller cursor; use it to highlight, select, click, or drive your own app-level event logic.
Note that hand tracking complements Touch controllers but is not intended to replace them in all scenarios, particularly in games or creative tools that require a high degree of precision.
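The Interaction SDK provides pinch and poke detection out of the box, so you should not need to implement this yourself. Purely as an illustration of the idea behind near-field pinch detection, the sketch below compares the distance between the thumb-tip and index-tip joint positions against a small threshold. All names here (`Vec3`, `IsPinching`, the 2 cm threshold) are hypothetical, not SDK API.

```cpp
#include <cmath>

// Hypothetical minimal vector type; in Unreal you would use FVector.
struct Vec3 {
    float X, Y, Z;
};

// Euclidean distance between two joint positions (meters).
inline float Distance(const Vec3& A, const Vec3& B) {
    const float DX = A.X - B.X, DY = A.Y - B.Y, DZ = A.Z - B.Z;
    return std::sqrt(DX * DX + DY * DY + DZ * DZ);
}

// A pinch is registered when the thumb tip and index tip are closer than
// a small threshold (~2 cm is a plausible starting point, not an SDK value).
inline bool IsPinching(const Vec3& ThumbTip, const Vec3& IndexTip,
                       float ThresholdMeters = 0.02f) {
    return Distance(ThumbTip, IndexTip) < ThresholdMeters;
}
```

In practice the SDK also applies debouncing and hysteresis so the pinch state does not flicker at the threshold boundary; a raw distance test like this is only the starting point.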
Note
The recommended way for Unreal developers to integrate hand tracking is the Interaction SDK, which provides standardized interactions and gestures. Building custom interactions without the SDK is a significant undertaking and can make store approval more difficult.
Data Usage Disclaimer: Enabling support for hand tracking grants your app access to certain user data, such as the user’s estimated hand size and hand pose data. This data is only permitted to be used for enabling hand tracking within your app and is expressly forbidden for any other purpose.

Samples

To see examples of hand tracking integration, check out the following:

Features

| Feature | Supported | Description | SDK |
| --- | --- | --- | --- |
| **Tracking** | | | |
| Hand Tracking | | Enables the use of hands as an input method. | Meta Core SDK |
| Fast Motion Mode (FMM) | | Provides improved tracking of fast movements common in fitness and rhythm apps (60 Hz). | Meta Core SDK |
| Wide Motion Mode (WMM) | | Tracks hands and displays plausible hand poses even when the hands are outside the headset’s field of view. | Meta Core SDK |
| Multimodal | | Provides simultaneous tracking of both hands and controllers. | Meta Core SDK |
| Capsense | | Provides logical hand poses when using controllers. | Meta Core SDK |
| OpenXR Hand Skeleton | | Supports the OpenXR hand skeleton standard. | Interaction SDK / Meta Core SDK |
| **Poses & Gestures** | | | |
| Pose Detection | ✅ (v78+) | A pose is detected when the tracked hand matches that pose’s required shapes and transforms. | Interaction SDK |
| Pose Recording | | Captures a pose for use in pose detection. | Interaction SDK |
| Gesture Detection | | Sequences recognize a series of IActiveStates over time to compose complex gestures. | Interaction SDK |
| Microgestures | | Recognizes thumb taps and thumb swipes performed on the side of the index finger. | Interaction SDK |
| **Interactions** | | | |
| Poke | | Interact with surfaces via direct touch using hands. | Interaction SDK |
| Grab | | Pick up or manipulate objects in the world using controllers or hands. | Interaction SDK |
| Hand Grab | | Provides a physics-less means of grabbing objects, designed specifically for hands. | Interaction SDK |
| Distance Grab | | Lets you use your hands to grab and move objects that are out of arm’s reach. | Interaction SDK |
| Ray Grab | | Lets the user interact with objects from a distance by casting a ray from the hand or controller. | Interaction SDK |
| Custom Grab Poses | | Record a custom hand grab pose to control how hands conform to a grabbed object. | Interaction SDK |
| Throw | | Enables throwing objects using hands. | Interaction SDK |
| Raycast | | Interact with objects in the world from a distance by casting a ray, or line, out from the hand or controller. | Interaction SDK |
| 2D Widget Interaction | | Handles all the scaffolding and plumbing necessary to display a Widget Blueprint in the world and make it interactable. | Interaction SDK |
| **Visuals** | | | |
| Custom Hand Models | | Replace the default Interaction SDK hands with your own set of custom hands. | Interaction SDK |