Learn about the different hand representations available in the Meta Quest SDK and key factors to consider when designing a hands experience.
Definition
Hand representation in XR refers to the visual depiction of users’ hands within an immersive experience. It encompasses the various appearances and functionalities that virtual hands can take on to create an immersive and intuitive interaction between users and virtual content.
At its core, hand representation serves two essential functions:
Sense of Embodiment: It provides users with a tangible representation of their hands, which allows them to feel confident in expressing themselves and interacting using their hands in an immersive experience.
Functional Clarity: Hand representations communicate to users what their tracked hands are capable of within a given immersive experience, enhancing their understanding of interaction possibilities.
Your first instinct might be to create a realistic representation of a human hand, but this can be an expensive and difficult endeavour. Since realistic hands are challenging to get right or to perfectly match a user’s real hands, they can make users feel uncomfortable or disconnected from their virtual selves. Instead, think about what works best for the experience you’re building.
If the visual representation of hands is an important part of your immersive experience, then it’s essential to ensure the representation is either generic enough for anyone to feel embodied (like an octopus hand) or can be customized by users to suit their needs.
In contexts where your hands are primarily an input rather than a part of the experience, we’ve found that a functionally realistic approach is ideal.
A functionally realistic approach is one where the representation itself lets people know what they can do with their hands within a given experience without requiring a level of detail that is hard to produce and easy to get wrong.
Examples of different degrees of hand detail.
Currently, Meta Quest uses three main types of hand representations:
Avatar hands
Avatars are a digital representation of you. Meta Avatars are available across all first-party Meta experiences, such as immersive home and workrooms. You can use the Movement SDK to create custom avatars or use the Avatars SDK to create Meta Avatars. By implementing the Avatars SDK into your experience, you not only provide users with hands but also offer them a visual representation of their body when they glance down or look in a mirror.
Users first encounter avatar hands in Quest Home. Avatar hands are ideal when emphasizing self-presence, but they aren’t recommended in passthrough due to potential uncanny valley experiences.
The appearance of the avatar hands (skin color, accessories) is determined by how users customize their Meta Avatars.
Custom hands (Developer-defined)
This pair of hands comes with feedback enhancements and is customizable for your specific experience. Leverage the Interaction SDK for Meta custom hands, or start from scratch based on hand pose data via the Hands API.
The custom hands we designed consist of two elements — a fill and an outline.
For the fill, we use a fresnel blending two dark grey tones to give it a subtle light effect and a constant slight transparency. Both effects give the hand presence in bright environments without obscuring what’s behind it.
The outline is a translucent light grey, which gives the hand a contrast in dark environments without being too extreme.
Both the fill and outline fade out at the wrist, since the wrist’s angle isn’t tracked and regularly breaks immersion and presence.
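As a rough illustration of the fill described above, a fresnel term can blend the two grey tones and a constant alpha can provide the slight transparency. This is a minimal sketch only; the grey values, fresnel exponent, and alpha below are hypothetical, not the shipped shader constants.

```python
def fresnel(normal_dot_view: float, power: float = 3.0) -> float:
    """Simple fresnel approximation: 0 where the surface faces the viewer,
    1 at grazing angles, giving the subtle rim-light effect."""
    clamped = max(0.0, min(1.0, normal_dot_view))
    return (1.0 - clamped) ** power

def fill_color(normal_dot_view: float):
    """Blend two dark grey tones by the fresnel term; alpha is constant.
    All values are hypothetical placeholders."""
    dark = 0.10    # hypothetical inner grey tone
    light = 0.25   # hypothetical rim grey tone
    alpha = 0.85   # constant slight transparency
    t = fresnel(normal_dot_view)
    grey = dark * (1.0 - t) + light * t
    return (grey, grey, grey, alpha)
```

In a real shader this would run per-fragment with the surface normal and view direction; the sketch only shows how the two tones and the transparency combine.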
Additionally, this hand shader is crafted to incorporate two types of feedback to enhance interactions:
Progress feedback gives the user a visual indicator that a gesture can be used for interaction.
The outline is light grey when your hand is open. Once you begin to curl a finger (for actions like pinching or grabbing) or move a finger closer to a virtual object (for actions like touching it), the outline of those fingers becomes brighter.
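The progress feedback above can be sketched as a simple brightness ramp driven by finger curl or touch proximity. The base and peak brightness values here are hypothetical, not taken from the actual shader.

```python
def clamp01(x: float) -> float:
    return max(0.0, min(1.0, x))

def outline_brightness(curl: float, touch_proximity: float,
                       base: float = 0.6, peak: float = 1.0) -> float:
    """Brighten a finger's outline as it curls (pinch/grab) or nears a
    surface (touch). Both inputs are normalized 0..1; whichever gesture
    has progressed further drives the brightness. Values are hypothetical."""
    progress = clamp01(max(curl, touch_proximity))
    return base + (peak - base) * progress
```

An open hand sits at the base brightness; a fully curled or touching finger reaches the peak, with a continuous ramp in between.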
Gating feedback
Gating feedback provides the user with a visual indicator, signifying that the hand is in a specific pose or area to initiate an action. An example of gating feedback is the System Gesture.
When your hand assumes a pose that triggers something (for example, the System Gesture), or enters an area where a specific gesture can be recognized (for example, a swipe gesture), the outline turns blue. The pinching fingers turn light blue as you start to pinch.
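The gating feedback above amounts to a small color-selection rule. This is a sketch of that logic only; the RGB values are hypothetical placeholders, not the system's actual colors.

```python
# Hypothetical outline colors (RGB, 0..1).
LIGHT_GREY = (0.8, 0.8, 0.8)
BLUE = (0.2, 0.5, 1.0)
LIGHT_BLUE = (0.6, 0.8, 1.0)

def outline_color(gated: bool, pinch_strength: float):
    """Pick the outline color for gating feedback.
    gated: the hand is in a recognized pose or gesture area.
    pinch_strength: 0 (open) to 1 (fully pinched)."""
    if not gated:
        return LIGHT_GREY          # default outline
    if pinch_strength > 0.0:
        return LIGHT_BLUE          # pinch has started within the gated state
    return BLUE                    # gated, but not yet pinching
```

A production implementation would likely blend between these colors over a few frames rather than switching instantly.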
Hands in passthrough
In passthrough mode, digital content is rendered and overlaid on top of the environmental background provided by passthrough. This includes the user’s physical hands, since they are also part of the passthrough video.
This creates a unique design challenge when the user interacts with virtual objects, because their physical hands (passthrough video) will be occluded by the virtual objects even if the hands are closer to the user and in front of the objects.
The user’s hands being occluded by virtual objects.
Overlaying the custom hands onto the physical hands.
Using Quest’s hand tracking data, we can overlay the custom hands onto the user’s physical hands, just like in VR experiences. By doing this, the user can clearly see the location and movements of their hands and joints, enabling them to interact with virtual objects.
However, this solution can degrade immersion, since the physical hands in passthrough appear behind virtual objects while the overlaid hands appear in front of them. Therefore, we recommend blending in the virtual hands only when the user’s hand is near virtual objects.
This misalignment is exaggerated when the user moves their hands closer to their face and is minimized when the user has outstretched arms. Misalignment between the virtual hand and hand in passthrough may be less noticeable depending on the hand position and orientation.
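One way to implement the distance-based blending recommended above is a linear opacity ramp on the overlaid hand. The near/far distances below are hypothetical values in metres, not tuned recommendations.

```python
def virtual_hand_opacity(distance_to_object: float,
                         near: float = 0.05, far: float = 0.30) -> float:
    """Fade the overlaid virtual hand in as it approaches a virtual object.
    Fully opaque within `near` metres, fully hidden beyond `far` metres,
    linear in between. The distances are hypothetical placeholders."""
    if distance_to_object <= near:
        return 1.0
    if distance_to_object >= far:
        return 0.0
    return (far - distance_to_object) / (far - near)
```

Driving opacity from the distance to the nearest interactable keeps the passthrough hands unobstructed during free movement while restoring the overlay exactly when it is needed for interaction.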
Show the user’s physical hands via masking
Showing physical hands via masking.
We can use the hand tracking’s mesh data as a mask for the virtual objects. By doing this, we can make the area transparent where the hands overlap the virtual content, which means the user can see their physical hands.
One of the challenges of using masking is the misalignment between the tracked hand and the user’s real hand. Since the mask is applied based on the tracked hand (not the hands seen through the passthrough feed), the size and position could have some offset, which breaks the immersion.
Another challenge is misalignment from latency when the hands are moving. This misalignment could become severe as the hands move faster.
One solution is to blur the edge of the mask; smooth edges reduce the visual contrast caused by the misalignment.
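The blurred mask edge can be sketched as a smoothstep falloff over the distance to the hand silhouette. This is an illustrative sketch; the feather width is a hypothetical value, and a real implementation would evaluate this per-pixel in a shader.

```python
def mask_alpha(signed_distance: float, feather: float = 0.01) -> float:
    """Alpha of the virtual object near the hand-mesh mask.
    signed_distance: metres from the hand silhouette; negative inside the hand.
    Returns 0 (fully cut out) inside the hand, 1 (fully drawn) outside,
    with a smooth feathered transition of width 2 * feather."""
    t = (signed_distance + feather) / (2 * feather)
    t = max(0.0, min(1.0, t))
    return t * t * (3 - 2 * t)   # smoothstep easing across the edge
```

The soft transition hides small offsets between the tracked mask and the hand seen in the passthrough feed, at the cost of a slightly fuzzy silhouette.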
We recommend this method only for small virtual objects, and only when there’s no need for precise interactions, since the user will aim with their physical hand, which differs from the tracked hand the system sees. For example, as shown in the screenshot below, grabbing an object wouldn’t be a problem. However, interactions that require higher precision, such as pressing a small button in a UI, could be challenging because the fingertips of the virtual hand and physical hand would be misaligned.
Blend the masking and virtual hand mesh outline
The custom hands and physical hands blended together.
By combining the two design solutions above, we can provide a good balance of immersion and usability, although misalignment of the virtual and passthrough hand may still occur depending on the position of the user’s hand.
As the user brings their hand closer to their face, the distortion of the passthrough feed increases, resulting in greater divergence between the virtual hand and physical hand.
Users interacting via direct touch might become confused if both the virtual and passthrough hands are visualized. If the two hands are not closely aligned, it might be unclear which of the two hand representations is targeting the selection and performing the interaction. You can also consider dynamically increasing the opacity and strength of the virtual hand visualization as it moves closer to the target object.
Dos and don’ts
Keep the following in mind to make your hands experience empowering rather than detracting from it:
Do
Select a hand representation that suits your experience, whether it’s immersive or mixed with the user’s physical surroundings.
Consider the reserved system gestures when designing hand interactions.
Don’t
Overlay a virtual avatar hand over passthrough feed.
When possible, don’t overlay the virtual hand over the passthrough feed of the physical hands. The hands will be misaligned, which might confuse the user about which hand is performing the interaction, and it might be difficult to interact via direct touch.