Hands

Updated: Dec 12, 2025
Discover how hands can be used to enhance experiences. This page focuses on hands as an input method for immersive experiences.

Usage

Humans use their hands to interact effectively with their surroundings, facilitating tasks such as tool usage, communication, and tactile exploration.
Similarly, in immersive experiences, hands are a vital input method. They facilitate natural interaction by reflecting their real-world role and extending it with gesture-based indirect control of virtual content.
[GIF: hand usage]

Terminology

These are the parts, characteristics, and frequently used terms to be familiar with:
[Diagram: a hand with terms labeled]
Joints
Joints in the hand provide flexibility and mobility. In XR, joint positions are referenced in code to determine interactions. See the diagram above.
Wrist
The joint connecting the hand with the forearm, often tracked in code for hand position.
Fingers and thumb
The five digits of the human hand, consisting of four fingers and one opposable thumb, are essential for performing tactile interactions.
Jitter
High-frequency, small, and often undesirable movements or shakiness in the virtual representation of the hand, even when the user's real hand is held still or moving smoothly.
Inside Out Body Tracking (IOBT)
A computer vision-based technology used in XR devices, such as Meta Quest headsets, to estimate and track the user's body pose, using only the cameras built into the headset.
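To make the jitter concept concrete, here is a minimal, illustrative smoothing sketch (plain Python, not SDK code): an exponential moving average low-pass filter of the kind commonly applied to tracked joint positions. The sample values and the smoothing factor are assumptions; real runtimes use more sophisticated, adaptive filters, but the trade-off is the same: a little latency in exchange for stability.

```python
# Illustrative jitter reduction: exponential moving average (EMA) low-pass
# filter applied to a noisy 1D joint coordinate (e.g. a wrist x-position).
# alpha and the sample data below are assumptions, not SDK values.

def smooth(samples, alpha=0.3):
    """Return an EMA-filtered copy of samples; lower alpha = smoother, laggier."""
    filtered = []
    prev = samples[0]
    for s in samples:
        prev = alpha * s + (1 - alpha) * prev  # blend new sample with history
        filtered.append(prev)
    return filtered

noisy = [0.0, 0.9, 0.1, 1.0, 0.0, 0.95]  # jittery raw coordinate samples
stable = smooth(noisy)                    # same length, smaller swings
```

The filtered signal swings far less than the raw one, which is exactly the visual effect of jitter suppression on a rendered hand.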

Interaction SDK overview

The Interaction SDK (Unity, Unreal), or ISDK, is a powerful toolkit designed to simplify and enhance the integration of hand tracking and hand-based interactions in immersive experiences. By leveraging the ISDK, developers can easily add best-practice interactions to their applications, ensuring a high level of polish and usability right out of the box. All interaction components included in the SDK have been meticulously fine-tuned, tested against multiple prototypes, and validated through user research, representing the current state of the art in interaction design. For a detailed breakdown of what is available, see ISDK features and interactions availability by engine.

Benefits of Interaction SDK

  • Save developer effort and time: Use ISDK in your app and abstract away the hard work of designing a polished interaction system. ISDK makes it faster and easier to build interactions that work for both controllers and hand tracking out of the box.
  • Use only what you need: Interaction SDK is a modular library, which means you only use the components you need to realize your experience and customize them based on your needs.
For a comprehensive overview of all input modalities and their corresponding input primitives, see Input primitives.

ISDK features and interactions availability by engine

Below is a breakdown of the features and interactions available in the Interaction SDK.

ISDK features

Feature | Description | Unity | Unreal | Spatial SDK
Hand Tracking | Enables the use of hands as an input method | ✔ | ✔ | ✔
Fast Motion Mode (FMM) | Provides improved tracking of fast movements common in fitness and rhythm apps (60 Hz) | ✔ | ✔ | ✖
Wide Motion Mode (WMM) | Allows you to track hands and display plausible hand poses even when the hands are outside the headset’s field of view | ✔ | ✖ | ✖
Multimodal | Provides simultaneous tracking of both hands and controllers | ✔ | ✖ | ✖
Capsense | Provides logical hand poses when using controllers | ✔ | ✔ | ✖
OpenXR Hand Skeleton | Support for the OpenXR hand skeleton standard | ✔ | ✔ | ✖

ISDK interactions

Interaction | Description | Unity | Unreal | Spatial SDK
Grab | Enables users to grab virtual objects | ✔ | ✔ | ✔
Hand Posing | Enables developers to define ideal hand poses for grabbed objects | ✔ | ✔ | ✖
Grab Transform | Enables users to scale or rotate grabbed objects | ✔ | ✔ | ✖
Grab Surface | Enables users to use grabbed objects | ✔ | ✖ | ✖
Snap | Enables grabbed objects to snap to an ideal position when released | ✔ | ✖ | ✖
Throw | Enables grabbed objects to be thrown on release | ✔ | ✖ | ✖
Ray | Enables users to select buttons or scroll panels from a distance | ✔ | ✔ | ✔
Poke | Enables users to poke buttons or scroll panels | ✔ | ✔ | ✔
Distance Grab | Enables users to grab virtual objects from a distance | ✔ | ✔ | ✖
Pose Detection | Enables developers to detect when users make specific poses | ✔ | ✖ | ✖
Teleport Locomotion | Enables users to move across a virtual space and turn by teleporting | ✔ | ✖ | ✖
Smooth Locomotion | Enables users to move smoothly across a virtual space and turn | ✔ | ✖ | ✖
Touch Grab | Enables a more physics-based approach to grabbing | ✔ | ✖ | ✖
Gesture Detection | Enables developers to detect when users make specific gestures | ✔ | ✖ | ✖
Microgesture | Enables users to perform microgestures | ✔ | ✖ | ✖
For more detailed design information and best practices for hands within Interaction SDK, see Hand Interaction Types.

Design

This section offers guidance on hand-based interaction techniques. Discover input primitives, understand design principles, consider ergonomic factors, and learn the essential dos and don’ts.

Interactions

Discover various input capabilities and interaction methods that use hands as the input modality:
Targeting
Target objects with the hands in two ways: directly, as in real life, by touching an interactable; or indirectly, by using a ray cast.
Selection
Letting the user choose or activate a targeted interactable with the hand, for example by performing a tap gesture.
Pose
A static moment where the joints of the hand are coordinated into a pose/orientation to communicate a specific action or command. Examples are: pinch, grab and custom poses.
Gesture
A specific movement (a sequence of poses) made by the hand, for example a swipe gesture: waving the hand in a direction.
Gate
Gating transitions the hand from an "idle" state to an "active" state. These transitions are performed using a pose or gesture.
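The pose and gate concepts above can be sketched in a few lines. This is an illustrative example, not Interaction SDK API: the joint positions, the pinch threshold, and the state names are all assumptions.

```python
# Illustrative (not SDK code): detect a pinch pose from hypothetical thumb-tip
# and index-tip positions, then use it to gate an "idle" -> "active" transition.
import math

PINCH_THRESHOLD = 0.02  # assumed: metres between thumb tip and index tip

def is_pinching(thumb_tip, index_tip):
    """A pose is a static joint configuration; here, fingertips held close together."""
    return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD

def gate(state, pinching):
    """Gating: a pose or gesture transitions the hand between idle and active."""
    if state == "idle" and pinching:
        return "active"
    if state == "active" and not pinching:
        return "idle"
    return state

state = "idle"
frames = [((0, 0, 0), (0.10, 0, 0)),   # hand open  -> stays idle
          ((0, 0, 0), (0.01, 0, 0)),   # pinch      -> becomes active
          ((0, 0, 0), (0.01, 0, 0))]   # pinch held -> stays active
for thumb, index in frames:
    state = gate(state, is_pinching(thumb, index))
```

Driving the gate per frame, rather than on a single sample, is what makes the hand feel like it holds its active state while the pose is maintained.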

Design principles

This section explores the fundamental concepts that shape intuitive and user-friendly interactions for the hands input modality.
Usability
The presence of hands sets user expectations to at least mimic real-world capabilities and potentially grant superpowers. Users anticipate the ability to pick up objects, push buttons, and execute actions beyond the bounds of reality.
Accessibility
Hand-based inputs should be designed with accessibility in mind. For users with limited hand mobility, alternative input methods or adaptive controllers should be provided.
Multimodal
Multimodal interactions refer to providing multiple input modalities, such as controllers, hands, voice, and more, for the user to interact. By incorporating multiple modalities into an application, users can enjoy the experience by naturally choosing the interaction method that serves them best in the given moment. This creates a more seamless and enjoyable experience, as well as increased accessibility.
Affordances
It's crucial to provide distinct affordances, feedback, and signifiers. For instance, when a user interacts with an object, such as picking it up, they should receive immediate and clear visual and auditory cues confirming the successful completion of the action. This enhances the user's understanding and interaction within the environment.
Remember that hands are not controllers
It’s very tempting to simply adapt existing interactions from input devices like controllers and apply them to hand tracking. But that process limits you to already-charted territory, and may lead to interactions that would feel better with controllers while missing out on the benefits of hands. Instead, focus on the unique strengths of hands as an input, and add devices or virtual tools to empower the user to be more successful.
Constraints to improve usability
Hands are practically unlimited in how they move and the poses they can form. This presents a world of opportunities, so pay special attention to the included guidelines that navigate the limitations.

For example, components like hand-anchored UI have recommended placements at certain areas on the hand. These areas are recommended because hand tracking can have difficulty when one hand occludes the other too much. Such constraints increase tracking accuracy and actually make it easier to navigate the system or complete an interaction.

Mapping controller interactions to hands

Below are some recommendations for mapping controller interactions to hands:
Controller | Interaction | Hands
Touch with controller tip | Selection (direct) | Touch with index finger tip
Trigger Button | Selection (indirect) | Index Tap (Pinch)
Grab Button | Grab/Hold (direct and indirect) | Palm Grab or Hold Pinch
Thumbstick | Locomotion | Microgesture
Menu Button | Open/Close Menu | System Gesture Right Hand
Menu Button | Open/Close App Menu | System Gesture Left Hand
A Button | Jump | Microgesture
B Button | Different per App/Context | Different per App/Context
X Button | Different per App/Context | Different per App/Context
Y Button | Different per App/Context | Different per App/Context

Comfort

Position

When designing experiences, it’s important to make sure the user can remain in a neutral body position as much as possible. Ideally, users should be able to interact with the system while keeping the arms close to the body and the elbows in line with the hips. This allows for a more comfortable experience, while keeping the hand in an ideal position for the tracking sensors.

Distance

Interactions should minimize muscle effort, so try not to make people reach too far from their body too frequently. When arranging information in virtual space, place the features a user will interact with most often closer to their body. The less important something is, the farther from the body it can be placed.
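The "closer means more frequent" rule above can be sketched as a simple assignment of reach distances by interaction frequency. The element names and the distance band are hypothetical values chosen for illustration, not SDK or research numbers.

```python
# Hypothetical sketch: rank UI elements by expected interaction frequency and
# assign each a distance from the body, most-used items nearest. NEAR/FAR and
# the element names below are illustrative assumptions.

NEAR, FAR = 0.35, 0.6  # assumed comfortable reach band, in metres

def assign_distances(elements):
    """elements: {name: uses_per_session}; returns {name: distance_in_metres}."""
    ranked = sorted(elements, key=elements.get, reverse=True)  # most used first
    step = (FAR - NEAR) / max(len(ranked) - 1, 1)
    return {name: round(NEAR + i * step, 3) for i, name in enumerate(ranked)}

layout = assign_distances({"grab_tool": 40, "settings": 2, "color_picker": 12})
# grab_tool (most used) lands nearest; settings (least used) lands farthest.
```

The point is not the specific numbers but the ordering: frequency of use, not visual importance, decides what sits within the low-effort reach zone.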

Next steps

More design resources on hands

Hand interaction examples

Designing experiences

Explore more design guidelines and learn how to design great experiences for your app:

Developing experiences

For technical information, start from these development guidelines:

Meta Spatial SDK

Unity

Unreal
